
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
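
As a concrete illustration, the sketch below trains Word2Vec embeddings on a toy corpus using the gensim library. It is a minimal example, not a reference implementation: it assumes gensim >= 4.0 (where the dimensionality parameter is named `vector_size`), and the corpus and hyperparameter values are illustrative only.

```python
# Minimal sketch: learning word embeddings with gensim's Word2Vec.
# Assumes gensim >= 4.0; the toy corpus below is illustrative only.
from gensim.models import Word2Vec

# A tokenized corpus: a list of sentences, each a list of word tokens.
# A real run would use a far larger corpus.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "and", "glove", "learn", "embeddings", "from", "text"],
    ["vectors", "of", "real", "numbers", "represent", "words"],
]

# vector_size sets the embedding dimensionality; window is the context size
# around each target word; min_count=1 keeps every token in this tiny corpus.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, seed=1)

vec = model.wv["embeddings"]   # the learned 50-dimensional vector for a word
print(vec.shape)               # -> (50,)

# Nearest neighbours by cosine similarity in the embedding space.
print(model.wv.most_similar("words", topn=3))
```

On a realistic corpus, words that appear in similar contexts end up with nearby vectors, which is what makes the embeddings useful as features for downstream NLP tasks.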

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 441–450 of 4,002 papers

Title | Status | Hype
A methodology to characterize bias and harmful stereotypes in natural language processing in Latin America | Code | 0
Empower Sequence Labeling with Task-Aware Neural Language Model | Code | 0
Analytical Methods for Interpretable Ultradense Word Embeddings | Code | 0
End-to-End Neural Ad-hoc Ranking with Kernel Pooling | Code | 0
End-to-End Text Classification via Image-based Embedding using Character-level Networks | Code | 0
Enhanced word embeddings using multi-semantic representation through lexical chains | Code | 0
Cross-Lingual BERT Transformation for Zero-Shot Dependency Parsing | Code | 0
Cross-lingual Models of Word Embeddings: An Empirical Comparison | Code | 0
Analyzing Continuous Semantic Shifts with Diachronic Word Similarity Matrices | Code | 0
A Resource-Light Method for Cross-Lingual Semantic Textual Similarity | Code | 0
Page 45 of 401

No leaderboard results yet.