
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
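To make the definition concrete, here is a minimal, self-contained sketch of the core idea: each word is mapped to a dense vector of real numbers, and geometric similarity between vectors stands in for semantic relatedness. The 3-dimensional vectors below are invented purely for illustration.

```python
import numpy as np

# Hypothetical embeddings for a toy vocabulary (values are made up).
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10]),
    "queen": np.array([0.75, 0.70, 0.15]),
    "apple": np.array([0.10, 0.05, 0.90]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two embedding vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should score higher than unrelated ones.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # close to 1
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # much lower
```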

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
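As a rough sketch of how such embeddings are trained in practice, the snippet below uses the gensim library's Word2Vec implementation (assumed installed, e.g. via pip install gensim). The toy corpus is invented for illustration and is far too small to yield meaningful vectors.

```python
from gensim.models import Word2Vec

# Tiny illustrative corpus: a list of tokenized sentences.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "farmer", "grows", "an", "apple"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,  # dimensionality of the learned vectors
    window=2,        # context window on each side of a target word
    min_count=1,     # keep even rare words in this tiny corpus
    sg=1,            # 1 = skip-gram; 0 = CBOW
    epochs=100,
)

vector = model.wv["king"]  # the learned 50-dimensional embedding for "king"
# Nearest neighbors in embedding space (noisy on a corpus this small).
print(model.wv.most_similar("king", topn=2))
```

The sg flag selects between the two classic Word2Vec objectives: skip-gram predicts context words from a target word, while CBOW predicts the target from its context.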

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3011-3020 of 4002 papers

Title | Status | Hype
Anaphora Resolution in Dialogue Systems for South Asian Languages | | 0
An Artificial Language Evaluation of Distributional Semantic Models | | 0
An Attentive Fine-Grained Entity Typing Model with Latent Type Representation | | 0
An Automated Method to Enrich Consumer Health Vocabularies Using GloVe Word Embeddings and An Auxiliary Lexical Resource | | 0
An Automatic Learning of an Algerian Dialect Lexicon by using Multilingual Word Embeddings | | 0
Anchor-based Bilingual Word Embeddings for Low-Resource Languages | | 0
An Efficient Cross-lingual Model for Sentence Classification Using Convolutional Neural Network | | 0
An efficient domain-independent approach for supervised keyphrase extraction and ranking | | 0
An Embedding Model for Predicting Roll-Call Votes | | 0
An Empirical Analysis of NMT-Derived Interlingual Embeddings and their Use in Parallel Sentence Identification | | 0
Page 302 of 401

No leaderboard results yet.