
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
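As a minimal sketch (with made-up toy vectors, not learned ones), the mapping can be pictured as a lookup table from words to dense real-valued vectors, where closeness is measured with cosine similarity:

```python
import numpy as np

# A toy embedding table: each vocabulary word maps to a dense real-valued
# vector (4-dimensional here; trained models typically use 100-300 dimensions).
embeddings = {
    "king":  np.array([0.50, 0.68, -0.59, 0.10]),
    "queen": np.array([0.54, 0.70, -0.55, 0.30]),
    "apple": np.array([-0.80, 0.10, 0.40, -0.20]),
}

def cosine_similarity(u, v):
    """Standard measure of closeness between two embedding vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should score higher than unrelated ones.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```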

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
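For a concrete example, the sketch below trains a Word2Vec model with the gensim library (assuming the gensim 4.x API); the two-sentence corpus is a stand-in, since useful embeddings require far more text:

```python
from gensim.models import Word2Vec

# Placeholder corpus: a list of tokenized sentences.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "and", "glove", "learn", "embeddings", "from", "text"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=100,  # dimensionality of the learned vectors
    window=5,         # context window size around each target word
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram; 0 = CBOW
)

vector = model.wv["embeddings"]           # learned vector for a word
similar = model.wv.most_similar("words")  # nearest neighbors in vector space
```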

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1341–1350 of 4002 papers

Title | Status | Hype
QMUL-SDS @ DIACR-Ita: Evaluating Unsupervised Diachronic Lexical Semantics Classification in Italian | | 0
Cross-lingual Word Embeddings beyond Zero-shot Machine Translation | | 0
AraWEAT: Multidimensional Analysis of Biases in Arabic Word Embeddings | | 0
DNN-Based Semantic Model for Rescoring N-best Speech Recognition List | | 0
Unsupervised Cross-Lingual Part-of-Speech Tagging for Truly Low-Resource Scenarios | | 0
Revisiting Representation Degeneration Problem in Language Modeling | | 0
How does BERT capture semantics? A closer look at polysemous words | Code | 0
Pre-tokenization of Multi-word Expressions in Cross-lingual Word Embeddings | | 0
Understanding the Source of Semantic Regularities in Word Embeddings | | 0
Cross-Lingual Suicidal-Oriented Word Embedding toward Suicide Prevention | | 0

No leaderboard results yet.