Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
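The idea can be made concrete with a toy example. The sketch below uses NumPy and invented 4-dimensional vectors (real embeddings typically have 50-300 dimensions) to show words mapped to vectors and compared by cosine similarity; the words and numbers are illustrative, not learned from data:

```python
import numpy as np

# Toy embedding table: each vocabulary word maps to a real-valued vector.
# Values are made up for illustration; real embeddings are learned from text.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10, 0.05]),
    "queen": np.array([0.78, 0.70, 0.12, 0.08]),
    "apple": np.array([0.05, 0.10, 0.90, 0.70]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; closer to 1.0 means more similar."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high: related words
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low: unrelated words
```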

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network approaches that are trained on an NLP task such as language modeling or document classification.
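As an illustration, the minimal sketch below trains a Word2Vec model with the Gensim library (4.x API); the three-sentence corpus and the hyperparameter values are invented for the example, and a real corpus would be far larger:

```python
from gensim.models import Word2Vec

# Tiny stand-in corpus: a list of tokenized sentences.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "farmer", "grows", "an", "apple"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=100,  # dimensionality of the learned vectors
    window=5,         # context window around each target word
    min_count=1,      # keep every word, even singletons (toy corpus)
    sg=1,             # 1 = skip-gram; 0 = CBOW
)

vector = model.wv["king"]             # learned 100-dimensional vector for "king"
print(model.wv.most_similar("king"))  # nearest neighbors in embedding space
```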

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3771–3780 of 4002 papers

Title | Status | Hype
Probing for Semantic Classes: Diagnosing the Meaning Content of Word Embeddings | Code | 0
Machine Translation with Cross-lingual Word Embeddings | Code | 0
Follow the Leader: Documents on the Leading Edge of Semantic Change Get More Citations | Code | 0
Analyzing Structures in the Semantic Vector Space: A Framework for Decomposing Word Embeddings | Code | 0
FRAGE: Frequency-Agnostic Word Representation | Code | 0
Frame- and Entity-Based Knowledge for Common-Sense Argumentative Reasoning | Code | 0
Word Embeddings Quantify 100 Years of Gender and Ethnic Stereotypes | Code | 0
Characterizing Linguistic Shifts in Croatian News via Diachronic Word Embeddings | Code | 0
Magnitude: A Fast, Efficient Universal Vector Embedding Utility Package | Code | 0
A Graph Degeneracy-based Approach to Keyword Extraction | Code | 0
Page 378 of 401

No leaderboard results yet.