
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
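
As a minimal sketch of that mapping (the words and 4-dimensional vectors below are invented for illustration, not learned from data; real embeddings typically have 50 to 1000 dimensions):

import numpy as np

# Toy lookup table: each vocabulary word maps to a vector of real numbers.
embedding = {
    "king":  np.array([0.8, 0.3, 0.9, 0.1]),
    "queen": np.array([0.7, 0.4, 0.9, 0.2]),
    "apple": np.array([0.1, 0.9, 0.2, 0.8]),
}

def cosine(u, v):
    # Cosine similarity: nearby vectors should correspond to related words.
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(embedding["king"], embedding["queen"]))  # relatively high (~0.99)
print(cosine(embedding["king"], embedding["apple"]))  # lower (~0.40)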

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
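
For example, Word2Vec embeddings can be trained with the gensim library. The following is a minimal sketch, assuming gensim 4.x is installed and with a three-sentence toy corpus standing in for real training data:

from gensim.models import Word2Vec

# Toy corpus: a real model would be trained on millions of sentences.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["i", "ate", "an", "apple"],
]

# Skip-gram Word2Vec (sg=1); vector_size is the embedding dimension.
model = Word2Vec(sentences, vector_size=50, window=2,
                 min_count=1, sg=1, epochs=50)

vec = model.wv["king"]                # the learned 50-dimensional vector
print(model.wv.most_similar("king"))  # nearest neighbours in embedding space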

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1431-1440 of 4002 papers

Conceptor Debiasing of Word Representations Evaluated on WEAT
A Study of Neural Matching Models for Cross-lingual IR
A Study of Cross-Lingual Ability and Language-specific Information in Multilingual BERT
A Multi-tiered Solution for Personalized Baggage Item Recommendations using FastText and Association Rule Mining
Computationally Constructed Concepts: A Machine Learning Approach to Metaphor Interpretation Using Usage-Based Construction Grammatical Cues
Computational Detection of Intertextual Parallels in Biblical Hebrew: A Benchmark Study Using Transformer-Based Language Models
A Structured Distributional Semantic Model: Integrating Structure with Semantics
Compression of Generative Pre-trained Language Models via Quantization
A Structured Distributional Semantic Model for Event Co-reference
A Multitask Objective to Inject Lexical Contrast into Distributional Semantics
