
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
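As a concrete illustration, an embedding is essentially a lookup table from words to dense vectors. The toy Python sketch below uses made-up four-dimensional vectors (real embeddings are learned from data and typically have 50-300 dimensions) and compares them with cosine similarity:

```python
import numpy as np

# Toy embedding table: each vocabulary word maps to a dense, real-valued vector.
# The values and the 4-dimensional size are invented for illustration only.
embeddings = {
    "king":  np.array([0.50, 0.68, -0.59, 0.21]),
    "queen": np.array([0.52, 0.71, -0.55, 0.26]),
    "apple": np.array([-0.43, 0.12, 0.90, -0.07]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; higher means more similar."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should sit closer together in the vector space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```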

Techniques for learning word embeddings include Word2Vec, GloVe, and neural network-based approaches trained on an NLP task such as language modeling or document classification.
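As a sketch of how such embeddings are trained in practice, the snippet below fits a small skip-gram Word2Vec model with the gensim library (assuming gensim 4.x is installed); the three-sentence corpus and all hyperparameters are illustrative only, not taken from any of the papers listed here.

```python
from gensim.models import Word2Vec

# Tiny illustrative corpus: each "sentence" is a list of tokens.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "statistics"],
]

# Train a skip-gram model (sg=1); vector_size is the embedding dimensionality.
# Real training uses far larger corpora, vocabularies, and min_count values.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

# Look up the learned vector for a word and its nearest neighbours.
vector = model.wv["embeddings"]                     # a 50-dimensional numpy array
print(model.wv.most_similar("embeddings", topn=3))  # closest words by cosine
```

GloVe, by contrast, is fit to global word co-occurrence counts rather than to local context windows, but the resulting vectors are used in the same way.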

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1921–1930 of 4002 papers

Title | Status | Hype
Named Entity Recognition Only from Word Embeddings | — | 0
Open Named Entity Modeling from Embedding Distribution | — | 0
Single Training Dimension Selection for Word Embedding with PCA | — | 0
Encoders Help You Disambiguate Word Senses in Neural Machine Translation | — | 0
Adversarial Representation Learning for Text-to-Image Matching | — | 0
A survey of cross-lingual features for zero-shot cross-lingual semantic parsing | — | 0
A Morpho-Syntactically Informed LSTM-CRF Model for Named Entity Recognition | — | 0
BULNER: BUg Localization with word embeddings and NEtwork Regularization | Code | 0
Unsupervised Construction of Knowledge Graphs From Text and Code | — | 0
On Measuring and Mitigating Biased Inferences of Word Embeddings | Code | 0
