Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
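The core idea above, that each word is mapped to a vector of real numbers and that geometric closeness stands in for semantic similarity, can be sketched with cosine similarity over toy vectors. The vectors below are hand-set for illustration only; a real system would learn them with Word2Vec, GloVe, or another neural model.

```python
import math

# Illustrative, hand-set embedding vectors (NOT trained weights).
# In practice these would be learned from a corpus.
embeddings = {
    "king":  [0.80, 0.65, 0.10],
    "queen": [0.78, 0.70, 0.12],
    "apple": [0.10, 0.05, 0.90],
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors (1.0 = same direction)."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

sim_royal = cosine_similarity(embeddings["king"], embeddings["queen"])
sim_fruit = cosine_similarity(embeddings["king"], embeddings["apple"])
print(f"king~queen: {sim_royal:.3f}, king~apple: {sim_fruit:.3f}")
```

Semantically related words ("king", "queen") end up with a higher cosine similarity than unrelated ones ("king", "apple"), which is the property downstream NLP tasks exploit.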

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2601-2625 of 4002 papers

Title | Status | Hype
USAAR-WLV: Hypernym Generation with Deep Neural Nets | | 0
Usability and Accessibility of Bantu Language Dictionaries in the Digital Age: Mobile Access in an Open Environment | | 0
Use Case: Romanian Language Resources in the LOD Paradigm | | 0
Use Generalized Representations, But Do Not Forget Surface Features | | 0
Use of unsupervised word classes for entity recognition: Application to the detection of disorders in clinical reports | | 0
User recommendation system based on MIND dataset | | 0
USF at SemEval-2019 Task 6: Offensive Language Detection Using LSTM With Word Embeddings | | 0
USI-IR at IEST 2018: Sequence Modeling and Pseudo-Relevance Feedback for Implicit Emotion Detection | | 0
Using Adversarial Debiasing to Remove Bias from Word Embeddings | | 0
Controllable Speaking Styles Using a Large Language Model | | 0
Using BERT Embeddings to Model Word Importance in Conversational Transcripts for Deaf and Hard of Hearing Users | | 0
Using bilingual word-embeddings for multilingual collocation extraction | | 0
Using Centroids of Word Embeddings and Word Mover's Distance for Biomedical Document Retrieval in Question Answering | | 0
Using Company Specific Headlines and Convolutional Neural Networks to Predict Stock Fluctuations | | 0
Using contextual and cross-lingual word embeddings to improve variety in template-based NLG for automated journalism | | 0
Using Convolution Neural Network with BERT for Stance Detection in Vietnamese | | 0
How Can BERT Help Lexical Semantics Tasks? | | 0
Using Embedding Masks for Word Categorization | | 0
Using Gaze Data to Predict Multiword Expressions | | 0
Using k-way Co-occurrences for Learning Word Embeddings | | 0
Using Large Pre-Trained Language Model to Assist FDA in Premarket Medical Device | | 0
Using Linked Disambiguated Distributional Networks for Word Sense Disambiguation | | 0
Using meaning instead of words to track topics | | 0
Using Mined Coreference Chains as a Resource for a Semantic Task | | 0
Using Neural Word Embeddings in the Analysis of the Clinical Semantic Verbal Fluency Task | | 0
Page 105 of 161

No leaderboard results yet.