Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train embeddings on an NLP task such as language modeling or document classification.
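For concreteness, below is a minimal sketch of learning such embeddings with the skip-gram Word2Vec model from the gensim library (4.x API); the toy corpus and hyperparameter values are illustrative assumptions, not settings drawn from any paper listed on this page.

    # Minimal Word2Vec (skip-gram) sketch using gensim 4.x; the corpus and
    # hyperparameters are toy values chosen only for illustration.
    from gensim.models import Word2Vec

    # A tiny tokenized corpus; real training uses millions of sentences.
    corpus = [
        ["the", "king", "rules", "the", "kingdom"],
        ["the", "queen", "rules", "the", "kingdom"],
        ["the", "dog", "chases", "the", "cat"],
        ["the", "cat", "chases", "the", "mouse"],
    ]

    model = Word2Vec(
        sentences=corpus,
        vector_size=50,  # dimensionality of each word vector
        window=2,        # context window around the target word
        min_count=1,     # keep every word in this toy vocabulary
        sg=1,            # 1 = skip-gram, 0 = CBOW
        epochs=200,      # many passes, since the corpus is tiny
    )

    # Every vocabulary word is now mapped to a vector of real numbers,
    vector = model.wv["king"]  # a 50-dimensional numpy array
    # and cosine similarity between vectors approximates semantic similarity.
    print(model.wv.most_similar("king", topn=3))

On a realistically sized corpus the nearest neighbours of "king" would include semantically related words such as "queen"; on this four-sentence toy corpus the output serves only as a sanity check that training ran.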

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 2711–2720 of 4002 (page 272 of 401)

Titles (each paper on this page is listed with no status and a hype score of 0):

- What company do words keep? Revisiting the distributional semantics of J.R. Firth & Zellig Harris
- What does Neural Bring? Analysing Improvements in Morphosyntactic Annotation and Lemmatisation of Slovenian, Croatian and Serbian
- What Does This Word Mean? Explaining Contextualized Embeddings with Natural Language Definition
- What do we need to know about an unknown word when parsing German
- What do you mean, BERT? Assessing BERT as a Distributional Semantics Model
- What makes multilingual BERT multilingual?
- What's in a Name? Reducing Bias in Bios without Access to Protected Attributes
- What's in an Embedding? Analyzing Word Embeddings through Multilingual Evaluation
- What's in Your Embedding, And How It Predicts Task Performance
- What the Vec? Towards Probabilistically Grounded Embeddings

Leaderboards

No leaderboard results yet.