
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
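As an illustrative sketch (not taken from any of the listed papers), the gensim library can train Word2Vec embeddings on a tokenized corpus; the toy corpus, hyperparameters, and lookups below are assumptions for demonstration only.

```python
from gensim.models import Word2Vec

# Toy tokenized corpus; in practice this would be a large text collection.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "surrounding", "context"],
    ["glove", "factorizes", "global", "cooccurrence", "statistics"],
]

# vector_size, window, min_count, and sg=1 (skip-gram) are illustrative settings.
model = Word2Vec(sentences=corpus, vector_size=100, window=5, min_count=1, sg=1)

# Each vocabulary word is now mapped to a 100-dimensional vector of real numbers.
vec = model.wv["embeddings"]
print(vec.shape)                                      # (100,)
print(model.wv.most_similar("embeddings", topn=3))    # nearest neighbors by cosine similarity
```

Here sg=1 selects the skip-gram objective (predicting context words from the center word); sg=0 would instead use CBOW, which predicts the center word from its context.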

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2511–2520 of 4002 papers

Title | Status | Hype
Learning Gender-Neutral Word Embeddings | Code | 0
Neural Cross-Lingual Named Entity Recognition with Minimal Resources | Code | 0
A Quantum Many-body Wave Function Inspired Language Modeling Approach | Code | 0
WiC: the Word-in-Context Dataset for Evaluating Context-Sensitive Meaning Representations | — | 0
Card-660: Cambridge Rare Word Dataset - a Reliable Benchmark for Infrequent Word Representation Models | — | 0
Adapting Word Embeddings to New Languages with Morphological and Phonological Subword Representations | Code | 0
An Investigation of the Interactions Between Pre-Trained Word Embeddings, Character Models and POS Tags in Dependency Parsing | — | 0
Predefined Sparseness in Recurrent Sequence Models | Code | 0
Generating Text through Adversarial Training using Skip-Thought Vectors | Code | 0
Unsupervised Multilingual Word Embeddings | Code | 0
Page 252 of 401

No leaderboard results yet.