
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
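
To make the mapping concrete, here is a minimal sketch using plain NumPy. The vectors are hand-picked toy values for illustration only; real embeddings are learned from data and typically have hundreds of dimensions.

```python
import numpy as np

# Toy 4-dimensional embeddings, hand-written for illustration.
# In practice these vectors are learned, not chosen by hand.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10, 0.05]),
    "queen": np.array([0.78, 0.70, 0.12, 0.06]),
    "apple": np.array([0.05, 0.10, 0.90, 0.70]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; 1.0 means same direction."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words get similar vectors, so their cosine
# similarity is high; unrelated words score much lower.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # close to 1.0
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # much lower
```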

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that are trained on an NLP task such as language modeling or document classification.
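
For concreteness, the sketch below trains Word2Vec embeddings with the gensim library on a tiny toy corpus. It assumes gensim 4.x, where the dimensionality argument is named `vector_size`; the corpus and hyperparameters are illustrative, not tied to any paper listed below.

```python
from gensim.models import Word2Vec

# A tiny pre-tokenized corpus; any iterable of token lists works.
sentences = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["cats", "and", "dogs", "are", "pets"],
]

model = Word2Vec(
    sentences,
    vector_size=50,  # embedding dimensionality
    window=2,        # context window size
    min_count=1,     # keep all words, even rare ones (toy corpus)
    sg=1,            # 1 = skip-gram; 0 = CBOW
    epochs=50,       # extra passes since the corpus is tiny
)

vec = model.wv["cat"]                        # learned 50-dim vector for "cat"
print(model.wv.most_similar("cat", topn=3))  # nearest neighbors in embedding space
```

The same `model.wv` interface also exposes vectors trained by other methods (e.g., pre-trained GloVe vectors converted to word2vec format), so downstream code does not depend on the training algorithm.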

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2551-2560 of 4002 papers

Title | Status | Hype
UniPi: Recognition of Mentions of Disorders in Clinical Text | - | 0
UNITN: Training Deep Convolutional Neural Network for Twitter Sentiment Classification | - | 0
Universal Joint Morph-Syntactic Processing: The Open University of Israel's Submission to The CoNLL 2017 Shared Task | - | 0
Unsupervised Abbreviation Disambiguation Contextual disambiguation using word embeddings | - | 0
Unsupervised Alignment of Distributional Word Embeddings | - | 0
Unsupervised Bilingual Lexicon Induction via Latent Variable Models | - | 0
Unsupervised Compositional Translation of Multiword Expressions | - | 0
Unsupervised Construction of Knowledge Graphs From Text and Code | - | 0
Unsupervised Cross-Lingual Part-of-Speech Tagging for Truly Low-Resource Scenarios | - | 0
Unsupervised Cross-lingual Word Embedding by Multilingual Neural Language Models | - | 0

No leaderboard results yet.