
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
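
To make the definition concrete, the sketch below uses plain NumPy with a hypothetical three-word lookup table (the vector values are invented for illustration; real embeddings are learned from data and typically have tens to hundreds of dimensions). It shows the word-to-vector mapping and how cosine similarity compares the resulting vectors.

```python
import numpy as np

# Toy lookup table mapping words to real-valued vectors.
# Values are invented for illustration only.
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1]),
    "queen": np.array([0.7, 0.4, 0.2]),
    "apple": np.array([0.1, 0.9, 0.8]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should end up with more similar vectors.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # ~0.98, high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # ~0.41, lower
```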

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches trained on an NLP task such as language modeling or document classification.
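
As a minimal sketch of one such technique, the following trains a small skip-gram Word2Vec model with the gensim library (assuming gensim 4.x; the tiny corpus and all hyperparameter values here are illustrative choices, not taken from the source).

```python
from gensim.models import Word2Vec

# Tiny illustrative corpus: a list of tokenized sentences.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

# sg=1 selects the skip-gram objective (sg=0 would be CBOW).
model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the learned vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,
    epochs=50,
)

# Look up a learned vector and query neighbors by cosine similarity.
vec = model.wv["embeddings"]                      # a 50-dimensional NumPy array
print(model.wv.most_similar("embeddings", topn=3))
```

On a corpus this small the neighbors are essentially noise; the point is only the API shape: train on tokenized sentences, then read vectors and similarities from `model.wv`.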

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 1601–1610 of 4002 (page 161 of 401)

Title | Status | Hype
Extrapolating Binder Style Word Embeddings to New Words | - | 0
Exploring Bilingual Word Embeddings for Hiligaynon, a Low-Resource Language | - | 0
Graph Exploration and Cross-lingual Word Embeddings for Translation Inference Across Dictionaries | - | 0
Are Word Embeddings Really a Bad Fit for the Estimation of Thematic Fit? | - | 0
CLFD: A Novel Vectorization Technique and Its Application in Fake News Detection | - | 0
Time-Aware Word Embeddings for Three Lebanese News Archives | Code | 0
Evaluating Word Embeddings for Indonesian–English Code-Mixed Text Based on Synthetic Data | - | 0
Evaluating the Impact of Sub-word Information and Cross-lingual Word Embeddings on Mi'kmaq Language Modelling | - | 0
Evaluating Sub-word Embeddings in Cross-lingual Models | - | 0
All That Glitters is Not Gold: A Gold Standard of Adjective-Noun Collocations for German | - | 0
