
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
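Concretely, an embedding is just a lookup table: a matrix with one real-valued row per vocabulary entry, and mapping a word to its vector is a row lookup. A minimal sketch in Python (the vocabulary, dimension, and random initialization here are illustrative assumptions, not a trained model):

```python
import numpy as np

# Illustrative 4-word vocabulary; real vocabularies hold tens of thousands of entries.
vocab = {"king": 0, "queen": 1, "man": 2, "woman": 3}
dim = 4  # embedding dimension (50-300 is typical in practice)

# The embedding table: one row of real numbers per word.
# Randomly initialized here; training would adjust these rows
# so that similar words end up with similar vectors.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), dim))

def embed(word: str) -> np.ndarray:
    """Map a word to its vector by looking up its row."""
    return embeddings[vocab[word]]

print(embed("queen"))  # a length-4 vector of real numbers
```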

Techniques for learning word embeddings include Word2Vec, which trains a shallow neural network to predict words from their contexts; GloVe, which fits vectors to global word co-occurrence statistics; and other approaches that learn embeddings as a by-product of training a neural network on an NLP task such as language modeling or document classification.
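For instance, here is a minimal Word2Vec training sketch assuming the gensim library; the toy corpus and all hyperparameters are illustrative, not a recommended configuration:

```python
from gensim.models import Word2Vec

# Tiny illustrative corpus: a list of tokenized sentences.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "man", "walks", "in", "the", "city"],
    ["the", "woman", "walks", "in", "the", "city"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,  # dimensionality of the learned vectors
    window=2,        # context window around each target word
    min_count=1,     # keep every token in this tiny corpus
    sg=1,            # skip-gram objective (0 would select CBOW)
    epochs=100,      # many passes, since the corpus is tiny
)

vector = model.wv["queen"]             # the learned 50-dimensional vector
print(model.wv.most_similar("king"))   # nearest neighbours in embedding space
```

In practice, pretrained Word2Vec or GloVe vectors are often downloaded and reused rather than retrained, and the choice of objective (skip-gram vs. CBOW), vector size, and window is tuned to the corpus and downstream task.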

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 2391–2400 of 4002 (page 240 of 401)

Title | Hype
The Chilean Waiting List Corpus: a new resource for clinical Named Entity Recognition in Spanish | 0
The Chinese Remainder Theorem for Compact, Task-Precise, Efficient and Secure Word Embeddings | 0
The CMU Submission for the Shared Task on Language Identification in Code-Switched Data | 0
The Corpus Replication Task | 0
The DCU Discourse Parser for Connective, Argument Identification and Explicit Sense Classification | 0
The DKU System Description for The Interspeech 2021 Auto-KWS Challenge | 0
The Effectiveness of Pre-Trained Code Embeddings | 0
The Effect of Pretraining on Extractive Summarization for Scientific Documents | 0
The effects of gender bias in word embeddings on depression prediction | 0
The emergent algebraic structure of RNNs and embeddings in NLP | 0

No leaderboard results yet.