Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
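To make the definition concrete, here is a minimal sketch of an embedding lookup table. The words, dimensionality, and vector values below are hypothetical placeholders, not learned embeddings; real models use vocabularies of many thousands of words and dimensions in the hundreds.

```python
import numpy as np

# Hypothetical embedding table: each vocabulary word maps to a dense
# real-valued vector. The values are made up for illustration only.
embeddings = {
    "king":  np.array([0.52, -0.11, 0.73]),
    "queen": np.array([0.49, -0.09, 0.80]),
    "apple": np.array([-0.33, 0.61, 0.02]),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity; semantically related words should score higher."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Related words sit closer together in the vector space than unrelated ones.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))
print(cosine_similarity(embeddings["king"], embeddings["apple"]))
```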

Techniques for learning word embeddings include shallow neural approaches such as Word2Vec, count-based approaches such as GloVe, and neural network models that learn embeddings while training on an NLP task such as language modeling or document classification.
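As a sketch of how one such technique is used in practice, the snippet below trains a skip-gram Word2Vec model with the gensim library (assuming gensim >= 4.0, where the dimensionality parameter is named vector_size). The three-sentence corpus is a toy stand-in; real training requires a much larger corpus.

```python
from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences, for illustration only.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "local", "context", "windows"],
    ["glove", "learns", "embeddings", "from", "global", "cooccurrence", "counts"],
]

# sg=1 selects the skip-gram objective; window is the context size,
# and vector_size is the dimensionality of the learned embeddings.
model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, sg=1, epochs=50)

vector = model.wv["embeddings"]                # learned 50-dimensional vector
print(model.wv.most_similar("word2vec", topn=3))
```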

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2081–2090 of 4002 papers

Title | Status | Hype
Detecting Local Insights from Global Labels: Supervised & Zero-Shot Sequence Labeling via a Convolutional Decomposition | Code | 0
Relational Word Embeddings | Code | 0
Tracing Antisemitic Language Through Diachronic Embedding Projections: France 1789-1914 | Code | 0
Are Girls Neko or Shōjo? Cross-Lingual Alignment of Non-Isomorphic Embeddings with Iterative Normalization | Code | 0
Chinese Embedding via Stroke and Glyph Information: A Dual-channel View | - | 0
Gender-preserving Debiasing for Pre-trained Word Embeddings | Code | 0
Global Textual Relation Embedding for Relational Understanding | Code | 0
Zero-Shot Semantic Segmentation | Code | 1
Contextually Propagated Term Weights for Document Representation | Code | 0
An Analysis of Deep Contextual Word Embeddings and Neural Architectures for Toponym Mention Detection in Scientific Publications | - | 0
Page 209 of 401

Leaderboard

No leaderboard results yet.