
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
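As a minimal sketch of what "mapping words to vectors of real numbers" means in code (the vocabulary, dimensionality, and vector values below are hypothetical placeholders; real embeddings are learned from data):

    import numpy as np

    # Toy vocabulary mapping each word to a row index in the embedding table.
    vocab = {"king": 0, "queen": 1, "apple": 2}
    embedding_dim = 4  # illustrative; real models use hundreds of dimensions

    # The embedding table holds one real-valued vector per word. Here the
    # vectors are random placeholders; in practice they are learned.
    rng = np.random.default_rng(0)
    embeddings = rng.normal(size=(len(vocab), embedding_dim))

    def embed(word: str) -> np.ndarray:
        """Map a word to its vector of real numbers via table lookup."""
        return embeddings[vocab[word]]

    print(embed("king"))  # a 4-dimensional real-valued vector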

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches, many of them neural network-based, that train on an NLP task such as language modeling or document classification.
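As a concrete example of one such technique, the sketch below trains a small skip-gram Word2Vec model with the gensim library (assuming gensim >= 4.0; the corpus and hyperparameters are illustrative placeholders, not recommended settings):

    from gensim.models import Word2Vec

    # A tiny placeholder corpus: each sentence is a list of tokens.
    corpus = [
        ["the", "king", "rules", "the", "kingdom"],
        ["the", "queen", "rules", "the", "kingdom"],
        ["apples", "grow", "on", "trees"],
    ]

    # Train skip-gram Word2Vec (sg=1) on the toy corpus.
    model = Word2Vec(
        corpus,
        vector_size=50,  # dimensionality of the learned vectors
        window=2,        # context window size
        min_count=1,     # keep every token in this tiny corpus
        sg=1,            # 1 = skip-gram, 0 = CBOW
        epochs=50,
    )

    # Look up a learned vector and query its nearest neighbours.
    print(model.wv["king"])                      # 50-dimensional vector
    print(model.wv.most_similar("king", topn=2))

Once trained, these per-word vectors can be fed as features into downstream NLP models.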

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1791–1800 of 4002 papers

Title | Status | Hype
Interactive Refinement of Cross-Lingual Word Embeddings | Code | 0
Should All Cross-Lingual Embeddings Speak English? | Code | 0
How Can BERT Help Lexical Semantics Tasks? | - | 0
Invariance and identifiability issues for word embeddings | - | 0
A Deep Learning approach for Hindi Named Entity Recognition | - | 0
Incremental Sense Weight Training for the Interpretation of Contextualized Word Embeddings | - | 0
Integrating Dictionary Feature into A Deep Learning Model for Disease Named Entity Recognition | - | 0
Assessing Social and Intersectional Biases in Contextualized Word Representations | Code | 0
Emerging Cross-lingual Structure in Pretrained Language Models | - | 0
Deep Contextualized Word Embeddings in Transition-Based and Graph-Based Dependency Parsing - A Tale of Two Parsers Revisited | - | 0
Page 180 of 401

No leaderboard results yet.