
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
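To make the mapping concrete, here is a toy Python sketch (entirely illustrative: the vectors below are hand-picked, not learned) of words mapped to small real-valued vectors and compared with cosine similarity, the usual measure of closeness in embedding space.

```python
import numpy as np

# Toy 3-dimensional embeddings; real models use hundreds of dimensions
# and learn the values from data. These numbers are made up.
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1]),
    "queen": np.array([0.7, 0.4, 0.2]),
    "apple": np.array([0.1, 0.9, 0.6]),
}

def cosine(u, v):
    """Cosine similarity: near 1.0 for similar directions, 0.0 for orthogonal."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(embeddings["king"], embeddings["queen"]))  # high: related words
print(cosine(embeddings["king"], embeddings["apple"]))  # lower: unrelated words
```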

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches trained on an NLP task such as language modeling or document classification.
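As a minimal sketch of the training step, assuming the gensim library (version 4 or later, where the dimensionality parameter is named vector_size), a Word2Vec model might be fit to a toy corpus like this; the sentences and hyperparameters are illustrative only, not a tuned setup.

```python
from gensim.models import Word2Vec

# Tiny illustrative corpus; a real model trains on millions of sentences.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "apple", "fell", "from", "the", "tree"],
]

model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the learned vectors
    window=2,        # context window size
    min_count=1,     # keep every word, even singletons (toy corpus)
    sg=1,            # 1 = skip-gram, 0 = CBOW
    epochs=20,
)

vec = model.wv["king"]                       # 50-dimensional numpy array
print(model.wv.most_similar("king", topn=2)) # nearest neighbors in the space
```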

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2351–2360 of 4002 papers

Genre Separation Network with Adversarial Training for Cross-genre Relation Extraction
Geographical Evaluation of Word Embeddings
Geographically-Balanced Gigaword Corpora for 50 Language Varieties
Geometry-aware Domain Adaptation for Unsupervised Alignment of Word Embeddings
Getting the ##life out of living: How Adequate Are Word-Pieces for Modelling Complex Morphology?
GHH at SemEval-2018 Task 10: Discovering Discriminative Attributes in Distributional Semantics
GiBERT: Introducing Linguistic Knowledge into BERT through a Lightweight Gated Injection Method
Give It a Shot: Few-shot Learning to Normalize ADR Mentions in Social Media Posts
GL at SemEval-2019 Task 5: Identifying hateful tweets with a deep learning approach.
GlobalTrait: Personality Alignment of Multilingual Word Embeddings
Page 236 of 401

No leaderboard results yet.