
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
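To make the idea concrete: below is a minimal sketch of such a mapping, using a hypothetical three-word vocabulary with hand-picked 3-dimensional vectors (all values are illustrative, not from any trained model). Related words get nearby vectors, which can be compared with cosine similarity.

```python
import numpy as np

# Hypothetical toy embedding table: each word maps to a vector of real numbers.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10]),
    "queen": np.array([0.78, 0.70, 0.12]),
    "apple": np.array([0.05, 0.10, 0.90]),
}

def cosine_similarity(u, v):
    # Cosine of the angle between two vectors; closer to 1.0 means more similar.
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high: related words
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low: unrelated words
```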

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
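As one concrete example, here is a minimal sketch of training a Word2Vec model with the gensim library (assuming gensim >= 4.0, where the dimension parameter is named vector_size; the toy corpus and hyperparameters are illustrative only):

```python
from gensim.models import Word2Vec

# Hypothetical toy corpus: each sentence is a list of tokens.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
]

# Train a small skip-gram model (sg=1); vector_size is the embedding dimension.
# min_count=1 keeps every word, which is only sensible for a toy corpus.
model = Word2Vec(corpus, vector_size=50, window=3, min_count=1, sg=1)

vec = model.wv["embeddings"]          # the learned 50-dimensional vector
print(model.wv.most_similar("word"))  # nearest neighbours in embedding space
```

In practice these models are trained on much larger corpora, and the learned vectors are then reused as features for downstream NLP tasks.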

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3291–3300 of 4002 papers (page 330 of 401)

Title | Status | Hype
End-to-end Recurrent Neural Network Models for Vietnamese Named Entity Recognition: Word-level vs. Character-level | Code | 0
Ontology-Aware Token Embeddings for Prepositional Phrase Attachment | Code | 0
Supervised Learning of Universal Sentence Representations from Natural Language Inference Data | Code | 1
Senti17 at SemEval-2017 Task 4: Ten Convolutional Neural Network Voters for Tweet Polarity Classification | | 0
On the effectiveness of feature set augmentation using clusters of word embeddings | | 0
Finnish resources for evaluating language model semantics | Code | 0
Word vectors, reuse, and replicability: Towards a community repository of large-text resources | | 0
The Making of the Royal Society Corpus | | 0
Wordnet extension via word embeddings: Experiments on the Norwegian Wordnet | | 0
Automatic Morpheme Segmentation and Labeling in Universal Dependencies Resources | Code | 0

No leaderboard results yet.