
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
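
As a concrete illustration of this mapping, the sketch below assigns toy 4-dimensional vectors to a few words and compares them with cosine similarity. The vectors and vocabulary here are made-up values for illustration, not learned embeddings; real embeddings are typically 50-300 dimensions and trained from data.

```python
import numpy as np

# Toy vocabulary -> real-vector mapping (made-up values, not learned embeddings).
embeddings = {
    "king":  np.array([0.8, 0.6, 0.1, 0.0]),
    "queen": np.array([0.7, 0.7, 0.1, 0.1]),
    "apple": np.array([0.0, 0.1, 0.9, 0.8]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; 1.0 means identical direction."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high: related words
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low: unrelated words
```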

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural approaches that train on an NLP task such as language modeling or document classification.
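
For example, a skip-gram Word2Vec model can be trained in a few lines with gensim. This is a minimal sketch assuming gensim 4.x; the tiny corpus and the hyperparameter values are placeholders, and in practice you would stream a large, preprocessed text corpus.

```python
from gensim.models import Word2Vec

# Placeholder corpus: a list of tokenized sentences (assumption, for illustration only).
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "trains", "on", "a", "language", "modeling", "task"],
    ["glove", "factorizes", "global", "co-occurrence", "statistics"],
]

# Skip-gram Word2Vec (sg=1); vector_size is the embedding dimension.
# min_count=1 keeps every token, which is only sensible for a toy corpus.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

vector = model.wv["word2vec"]                  # learned 50-d vector for one token
print(model.wv.most_similar("word", topn=3))   # nearest neighbors in embedding space
```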

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2891–2900 of 4002 papers

Title | Status | Hype
A Survey of Word Embeddings Evaluation Methods | Code | 0
A Resource-Light Method for Cross-Lingual Semantic Textual Similarity | Code | 0
WEAC: Word embeddings for anomaly classification from event logs | - | 0
Contextual and Position-Aware Factorization Machines for Sentiment Classification | - | 0
Biomedical Question Answering via Weighted Neural Network Passage Retrieval | - | 0
Recognition of Hyponymy and Meronymy Relations in Word Embeddings for Polish | - | 0
Wordnet-based Evaluation of Large Distributional Models for Polish | - | 0
ReferenceNet: a semantic-pragmatic network for capturing reference relations. | - | 0
Multilingual Wordnet sense Ranking using nearest context | - | 0
An Iterative Approach for Unsupervised Most Frequent Sense Detection using WordNet and Word Embeddings | - | 0
Page 290 of 401

No leaderboard results yet.