
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
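
For intuition, here is a minimal sketch of what such a mapping looks like in practice: a lookup table from words to dense real-valued vectors, compared with cosine similarity. The 4-dimensional vectors below are made-up toy values chosen only for illustration; real embeddings are learned from data and typically have 50 to 300 dimensions.

```python
import numpy as np

# An embedding is just a lookup table from words to real-valued vectors.
# These 4-dimensional values are invented for illustration only.
embeddings = {
    "king":  np.array([0.50, 0.68, -0.59, 0.20]),
    "queen": np.array([0.54, 0.61, -0.23, 0.10]),
    "apple": np.array([-0.80, 0.10, 0.90, -0.40]),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two vectors; 1.0 means identical direction."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Semantically related words should score higher than unrelated ones.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # ~0.94
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # ~-0.71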

Techniques for learning word embeddings include Word2Vec, which trains a shallow neural network on a context-prediction task, GloVe, which fits vectors to global word co-occurrence statistics, and other approaches that learn vectors from an NLP task such as language modeling or document classification.
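
To make this concrete, here is a minimal training sketch using the Word2Vec implementation from the gensim library (choosing gensim is an assumption of this example; any comparable toolkit works). It assumes gensim >= 4.0, where the dimensionality parameter is named vector_size, and uses a tiny placeholder corpus; useful vectors require training on millions of tokens.

```python
from gensim.models import Word2Vec

# Tiny placeholder corpus: a list of tokenized sentences.
# Real training corpora contain millions of tokens.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "raw", "text"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=100,  # dimensionality of the learned vectors
    window=5,         # context window around each target word
    min_count=1,      # keep every word (raise this on real corpora)
    sg=1,             # 1 = skip-gram objective, 0 = CBOW
    epochs=50,        # extra passes help on such a tiny corpus
)

vector = model.wv["embeddings"]               # a 100-dimensional numpy array
print(model.wv.most_similar("embeddings", topn=3))
```

With the skip-gram objective, the network is trained to predict context words from each target word; once training finishes, the learned input weights are kept as the word vectors and the prediction head is discarded.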

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3001–3010 of 4002 papers

Title | Status | Hype
Analyzing Correlations Between Intrinsic and Extrinsic Bias Metrics of Static Word Embeddings With Their Measuring Biases Aligned | | 0
Analyzing Semantic Change in Japanese Loanwords | | 0
Analyzing the Framing of 2020 Presidential Candidates in the News | | 0
Analyzing the Limitations of Cross-lingual Word Embedding Mappings | | 0
Analyzing the Representational Geometry of Acoustic Word Embeddings | | 0
Analyzing Word Embedding Through Structural Equation Modeling | | 0
An Analysis of Deep Contextual Word Embeddings and Neural Architectures for Toponym Mention Detection in Scientific Publications | | 0
An Analysis of Embedding Layers and Similarity Scores using Siamese Neural Networks | | 0
An Analysis of Hierarchical Text Classification Using Word Embeddings | | 0
An analysis of the user occupational class through Twitter content | | 0
Page 301 of 401

Leaderboards

No leaderboard results yet.