
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches that train either on word co-occurrence statistics or on an NLP task such as language modeling or document classification.
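As a minimal sketch of what this mapping looks like in practice, the snippet below trains skip-gram Word2Vec vectors on a toy corpus. The choice of the gensim library and the tiny hand-made corpus are assumptions for illustration; the page itself does not prescribe any toolkit.

```python
from gensim.models import Word2Vec

# Toy corpus: each document is a list of tokens. Real training would use a
# large tokenized corpus; this hand-made example is illustrative only.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "vectors", "from", "context", "windows"],
    ["similar", "words", "get", "similar", "vectors"],
]

# Train a skip-gram Word2Vec model (sg=1). vector_size sets the
# dimensionality of the learned real-valued vectors.
model = Word2Vec(
    sentences, vector_size=50, window=3, min_count=1, sg=1, epochs=50
)

# Each vocabulary word is now mapped to a 50-dimensional real-valued vector.
vec = model.wv["vectors"]
print(vec.shape)  # (50,)

# Cosine similarity in the embedding space approximates relatedness.
print(model.wv.most_similar("words", topn=3))
```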

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2031–2040 of 4002 papers

Title | Status | Hype
DNN-Based Semantic Model for Rescoring N-best Speech Recognition List |  | 0
Document Embedding for Scientific Articles: Efficacy of Word Embeddings vs TFIDF |  | 0
Document-Level Machine Translation with Word Vector Models |  | 0
Document-Level Sentiment Analysis of Urdu Text Using Deep Learning Techniques |  | 0
Do Deep Learning Models and News Headlines Outperform Conventional Prediction Techniques on Forex Data? |  | 0
Does History Matter? Using Narrative Context to Predict the Trajectory of Sentence Sentiment |  | 0
Does the Geometry of Word Embeddings Help Document Classification? A Case Study on Persistent Homology-Based Representations |  | 0
Does the Geometry of Word Embeddings Help Document Classification? A Case Study on Persistent Homology Based Representations |  | 0
Do gender neutral affixes naturally reduce gender bias in static word embeddings? |  | 0
Domain adaptation challenges of BERT in tokenization and sub-word representations of Out-of-Vocabulary words |  | 0
Page 204 of 401

No leaderboard results yet.