
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
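
As a minimal sketch of this idea, the Python snippet below maps a toy vocabulary to hand-picked 3-dimensional vectors (the words and vector values are purely illustrative, not learned from data) and compares words by cosine similarity, the standard closeness measure in embedding spaces:

```python
import numpy as np

# Toy embedding table: these vectors are made up for illustration.
# Learned embeddings typically have 50-1000 dimensions.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10]),
    "queen": np.array([0.75, 0.70, 0.15]),
    "apple": np.array([0.10, 0.20, 0.90]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; 1.0 means same direction."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# In a well-trained space, related words score higher than unrelated ones.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```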

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that are trained on an NLP task such as language modeling or document classification.
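
As a concrete example, the sketch below trains a skip-gram Word2Vec model with the gensim library; this is one possible implementation, assuming gensim 4.x is installed, and it uses a tiny in-memory corpus as a stand-in for real training data:

```python
from gensim.models import Word2Vec

# Tiny stand-in corpus: a list of tokenized sentences.
# A real corpus would contain millions of sentences.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["i", "ate", "an", "apple"],
]

# sg=1 selects the skip-gram objective (sg=0 would be CBOW);
# vector_size is the dimensionality of the learned vectors.
model = Word2Vec(
    sentences,
    vector_size=50,
    window=2,
    min_count=1,
    sg=1,
    epochs=50,
)

vec = model.wv["king"]                # learned 50-dimensional vector for "king"
print(model.wv.most_similar("king"))  # nearest neighbours by cosine similarity
```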

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1531–1540 of 4002 papers

| Title | Status | Hype |
| --- | --- | --- |
| Massive vs. Curated Embeddings for Low-Resourced Languages: the Case of Yorùbá and Twi | | 0 |
| All That Glitters is Not Gold: A Gold Standard of Adjective-Noun Collocations for German | | 0 |
| Evaluating the Impact of Sub-word Information and Cross-lingual Word Embeddings on Mi'kmaq Language Modelling | | 0 |
| Evaluating Sub-word Embeddings in Cross-lingual Models | | 0 |
| On the Influence of Coreference Resolution on Word Embeddings in Lexical-semantic Evaluation Tasks | | 0 |
| Estimating User Communication Styles for Spoken Dialogue Systems | | 0 |
| Towards a Semi-Automatic Detection of Reflexive and Reciprocal Constructions and Their Representation in a Valency Lexicon | | 0 |
| ENGLAWI: From Human- to Machine-Readable Wiktionary | | 0 |
| Identification of Indigenous Knowledge Concepts through Semantic Networks, Spelling Tools and Word Embeddings | | 0 |
| Abusive language in Spanish children and young teenager's conversations: data preparation and short text classification with contextual word embeddings | | 0 |

Leaderboards

No leaderboard results yet.