
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
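As a minimal illustrative sketch, the snippet below maps a toy vocabulary to hypothetical 4-dimensional vectors and compares them by cosine similarity. The words and vector values are invented for demonstration and do not come from any trained model; real embeddings are typically learned from large corpora and have 50 to 300 or more dimensions.

```python
import numpy as np

# Toy embedding table: each word maps to a dense real-valued vector.
# These 4-dimensional vectors are invented for illustration only.
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1, 0.9]),
    "queen": np.array([0.7, 0.4, 0.2, 0.9]),
    "apple": np.array([0.1, 0.9, 0.8, 0.0]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; values near 1 mean similar."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```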

Techniques for learning word embeddings include Word2Vec, GloVe, and neural network approaches that learn embeddings while training on an NLP task such as language modeling or document classification.
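For concreteness, here is a minimal sketch of training a Word2Vec model with the gensim library, assuming the gensim 4.x API; the tokenized toy corpus is invented for illustration, and a real model would be trained on far more text.

```python
from gensim.models import Word2Vec

# Toy tokenized corpus, invented for illustration.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "child", "eats", "an", "apple"],
]

# Train a small skip-gram Word2Vec model (gensim 4.x API).
model = Word2Vec(
    sentences,
    vector_size=50,   # embedding dimensionality
    window=2,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=100,
)

vec = model.wv["king"]                 # learned 50-dimensional vector for "king"
print(model.wv.most_similar("king"))   # nearest neighbors in embedding space
```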

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3521–3530 of 4002 papers

Title | Status | Hype
Enhanced word embeddings using multi-semantic representation through lexical chains | Code | 0
ML-EAT: A Multilevel Embedding Association Test for Interpretable and Transparent Social Science | Code | 0
Learning Gender-Neutral Word Embeddings | Code | 0
Enhancing biomedical word embeddings by retrofitting to verb clusters | Code | 0
Semantics or spelling? Probing contextual word embeddings with orthographic noise | Code | 0
On the Dimensionality of Word Embedding | Code | 0
Enhancing Deep Learning with Embedded Features for Arabic Named Entity Recognition | Code | 0
On the Downstream Performance of Compressed Word Embeddings | Code | 0
Classifying Relations by Ranking with Convolutional Neural Networks | Code | 0
Learning language through pictures | Code | 0

No leaderboard results yet.