
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
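As a minimal sketch of the idea (the toy vocabulary, dimensionality, and random vectors below are illustrative assumptions, since real embeddings are learned from a corpus), each word is paired with a dense real-valued vector, and geometric measures such as cosine similarity stand in for semantic relatedness:

```python
import numpy as np

# Illustrative only: a toy vocabulary mapped to dense real-valued vectors.
# In practice these vectors are learned from data, not drawn at random.
embedding_dim = 4
vocab = ["king", "queen", "man", "woman"]
rng = np.random.default_rng(seed=0)
embeddings = {word: rng.normal(size=embedding_dim) for word in vocab}

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine of the angle between two vectors; 1.0 means identical direction."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# With trained embeddings, related words score higher than unrelated ones.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))
```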

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
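As a hedged example of training one of these models, here is a minimal skip-gram Word2Vec sketch using the gensim library; the tiny corpus and the hyperparameter values are placeholder assumptions, not settings taken from any paper listed below:

```python
from gensim.models import Word2Vec

# Placeholder corpus: each sentence is a list of pre-tokenized words.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["a", "man", "and", "a", "woman", "walk"],
]

# sg=1 selects the skip-gram objective; sg=0 would use CBOW instead.
model = Word2Vec(
    sentences,
    vector_size=50,   # dimensionality of the learned vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,
    epochs=50,
)

# Look up a learned vector and query nearest neighbors in embedding space.
king_vector = model.wv["king"]
print(king_vector.shape)                      # (50,)
print(model.wv.most_similar("king", topn=3))  # noisy on a corpus this small
```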

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1231–1240 of 4002 papers

CVBed: Structuring CVs using Word Embeddings
Automatically Linking Lexical Resources with Word Sense Embedding Models
An Automatic Learning of an Algerian Dialect Lexicon by using Multilingual Word Embeddings
Adversarial Representation Learning for Text-to-Image Matching
Current Trends and Approaches in Synonyms Extraction: Potential Adaptation to Arabic
Curatr: A Platform for Semantic Analysis and Curation of Historical Literary Texts
Automatically Inferring Implicit Properties in Similes
CU-NLP at SemEval-2016 Task 8: AMR Parsing using LSTM-based Recurrent Neural Networks
Cultural Cartography with Word Embeddings
Automatically Building a Multilingual Lexicon of False Friends With No Supervision
Page 124 of 401
