
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural or count-based approaches that train on an NLP task such as language modeling or document classification. A minimal training sketch follows this paragraph.
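As a concrete illustration, the sketch below trains skip-gram Word2Vec embeddings with the gensim library. The toy corpus and all hyperparameter values are illustrative assumptions, not taken from any of the papers listed on this page; it assumes gensim 4.x.

```python
# A minimal sketch of learning word embeddings with skip-gram Word2Vec
# via gensim (assumed gensim >= 4.0). Corpus and hyperparameters are
# illustrative only.
from gensim.models import Word2Vec

# Toy corpus: each document is a list of tokens.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "raw", "text"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the embedding vectors
    window=2,         # context window size
    min_count=1,      # keep every token in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=100,       # many passes, since the corpus is tiny
    seed=42,
)

vec = model.wv["embeddings"]   # a 50-dimensional real-valued vector
print(vec.shape)               # (50,)
print(model.wv.most_similar("embeddings", topn=3))
```

In practice the corpus would be millions of sentences, and the resulting vectors would be used as input features for downstream tasks such as those in the papers below.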

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3001-3010 of 4002 papers

Title | Status | Hype
Building a Web-Scale Dependency-Parsed Corpus from CommonCrawl | | 0
Improving Lexical Choice in Neural Machine Translation | Code | 0
Evaluating Word Embeddings for Sentence Boundary Detection in Speech Transcripts | | 0
Bag-of-Vector Embeddings of Dependency Graphs for Semantic Induction | | 0
Synonym Discovery with Etymology-based Word Embeddings | | 0
Volatility Prediction using Financial Disclosures Sentiments with Word Embedding-based IR Models | | 0
Structured Embedding Models for Grouped Data | Code | 0
KeyVec: Key-semantics Preserving Document Representations | | 0
Application of a Hybrid Bi-LSTM-CRF model to the task of Russian Named Entity Recognition | Code | 0
Learning of Colors from Color Names: Distribution and Point Estimation | Code | 0
Page 301 of 401

No leaderboard results yet.