
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)
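As a minimal sketch of the idea described above, the snippet below trains a small Word2Vec model with gensim and looks up the learned real-valued vector for a word. The toy corpus, the hyperparameter values, and the queried word are illustrative assumptions, not taken from any paper listed on this page.

# Minimal Word2Vec sketch (assumes gensim is installed).
# Corpus and hyperparameters are illustrative only.
from gensim.models import Word2Vec

# Tiny tokenized corpus; real models train on millions of sentences.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "local", "context"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "statistics"],
]

# sg=1 selects the skip-gram objective; vector_size is the embedding dimension.
model = Word2Vec(sentences=corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

vec = model.wv["embeddings"]   # a 50-dimensional vector of real numbers
print(vec.shape)               # (50,)
print(model.wv.most_similar("embeddings", topn=3))  # nearest neighbours in vector space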

Papers

Showing 701-710 of 4002 papers

Title | Status | Hype
Looking Into the Black Box - How Are Idioms Processed in BERT? | - | 0
Pre-training and Fine-tuning Neural Topic Model: A Simple yet Effective Approach to Incorporating External Knowledge | - | 0
Improving Word Translation via Two-Stage Contrastive Learning | Code | 1
First Bilingual Word Embeddings for te reo Māori and English: Towards Code-switching Detection in a Low-resourced setting | - | 0
Non-Linear Relational Information Probing in Word Embeddings | - | 0
Sentence Selection Strategies for Distilling Word Embeddings from BERT | - | 0
Vec2Node: Self-training with Tensor Augmentation for Text Classification with Few Labels | - | 0
Detecting Unassimilated Borrowings in Spanish: An Annotated Corpus and Approaches to Modeling | - | 0
FeelsGoodMan: Inferring Semantics of Twitch Neologisms | - | 0
Is Neural Topic Modelling Better than Clustering? An Empirical Study on Clustering with Contextual Embeddings for Topics | - | 0
Page 71 of 401

No leaderboard results yet.