
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
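To make the mapping concrete, here is a minimal sketch of an embedding-matrix lookup using NumPy. The vocabulary, dimensionality, and random values are hypothetical; real systems learn these vectors from data and use hundreds of dimensions.

```python
import numpy as np

# Hypothetical toy vocabulary; in practice this is built from a corpus.
vocab = {"king": 0, "queen": 1, "apple": 2}

# Embedding matrix: one 4-dimensional real-valued vector per word.
# Here the values are random; a trained model would learn them.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), 4))

def embed(word: str) -> np.ndarray:
    """Map a word to its vector by indexing into the embedding matrix."""
    return embeddings[vocab[word]]

print(embed("king"))  # a 4-dimensional vector of real numbers
```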

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
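As an illustration, the sketch below trains a small Word2Vec model with gensim (assuming gensim >= 4.0, where the dimensionality parameter is named vector_size). The toy corpus and parameter values are hypothetical, chosen only to make the example self-contained.

```python
from gensim.models import Word2Vec

# Hypothetical toy corpus: a list of tokenized sentences.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

# Train a small skip-gram model (sg=1); vector_size is the embedding dimension.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

# Look up the learned vector for a word and find its nearest neighbors.
vec = model.wv["embeddings"]  # a 50-dimensional real-valued vector
print(model.wv.most_similar("embeddings", topn=3))
```

The skip-gram objective used here predicts surrounding context words from each target word; setting sg=0 would switch to the CBOW variant, which predicts a target word from its context.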

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 861–870 of 4,002 papers

Title | Status | Hype
Rethinking Stealthiness of Backdoor Attack against NLP Models | Code | 1
Modeling Text using the Continuous Space Topic Model with Pre-Trained Word Embeddings | — | 0
RAW-C: Relatedness of Ambiguous Words in Context (A New Lexical Resource for English) | — | 0
UMUTeam at SemEval-2021 Task 7: Detecting and Rating Humor and Offense with Linguistic Features and Word Embeddings | Code | 0
Beyond Offline Mapping: Learning Cross-lingual Word Embeddings through Context Anchoring | — | 0
PolyU CBS-Comp at SemEval-2021 Task 1: Lexical Complexity Prediction (LCP) | — | 0
RS_GV at SemEval-2021 Task 1: Sense Relative Lexical Complexity Prediction | — | 0
Evaluating a Joint Training Approach for Learning Cross-lingual Embeddings with Sub-word Information without Parallel Corpora on Lower-resource Languages | — | 0
Compound or Term Features? Analyzing Salience in Predicting the Difficulty of German Noun Compounds across Domains | — | 0
Learning Embeddings for Rare Words Leveraging Internet Search Engine and Spatial Location Relationships | — | 0
Page 87 of 401

No leaderboard results yet.