
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train embeddings on an NLP task such as language modeling or document classification.
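As a concrete illustration, the sketch below trains a small Word2Vec model with the gensim library (an assumption for illustration; the papers listed here use a variety of toolkits). The toy corpus and all parameter values are illustrative, not recommendations.

```python
# A minimal sketch of learning word embeddings with Word2Vec,
# assuming gensim >= 4.0 is installed (pip install gensim).
from gensim.models import Word2Vec

# Toy corpus: each document is a list of tokens.
# Real training would use a much larger corpus.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["glove", "and", "word2vec", "learn", "embeddings", "from", "text"],
    ["vectors", "capture", "semantic", "similarity", "between", "words"],
]

# Train a skip-gram model; vector_size is the embedding dimension.
model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the learned vectors
    window=3,        # context window size
    min_count=1,     # keep every token in this tiny corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
    epochs=100,      # many passes, since the corpus is tiny
)

# Each vocabulary word is now mapped to a vector of real numbers...
vec = model.wv["embeddings"]  # numpy array of shape (50,)
print(vec.shape)

# ...and nearby vectors correspond to words used in similar contexts.
print(model.wv.most_similar("words", topn=3))
```

With a realistic corpus, the nearest neighbors returned by `most_similar` reflect semantic relatedness, which is what downstream tasks exploit.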

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1611–1620 of 4002 papers

Title | Status | Hype
LMU Bilingual Dictionary Induction System with Word Surface Similarity Scores for BUCC 2020 | | 0
Morphological Disambiguation of South Sámi with FSTs and Neural Networks | | 0
Evaluating the Impact of Sub-word Information and Cross-lingual Word Embeddings on Mi'kmaq Language Modelling | | 0
Evaluating Sub-word Embeddings in Cross-lingual Models | | 0
All That Glitters is Not Gold: A Gold Standard of Adjective-Noun Collocations for German | | 0
Word Embedding Evaluation in Downstream Tasks and Semantic Analogies | | 0
Towards Entity Spaces | | 0
Estimating User Communication Styles for Spoken Dialogue Systems | | 0
Identifying Cognates in English-Dutch and French-Dutch by means of Orthographic Information and Cross-lingual Word Embeddings | | 0
Translating Knowledge Representations with Monolingual Word Embeddings: the Case of a Thesaurus on Corporate Non-Financial Reporting | | 0

Leaderboards

No leaderboard results yet.