
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
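As a minimal sketch of how such embeddings are learned in practice, the snippet below trains a skip-gram Word2Vec model with the gensim library (assumes gensim 4.x; the tiny tokenized corpus is hypothetical and only for illustration):

```python
# A minimal sketch of learning word embeddings with gensim's Word2Vec.
# Assumes gensim 4.x; the toy corpus below is hypothetical.
from gensim.models import Word2Vec

# Tiny tokenized corpus: each document is a list of word tokens.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["glove", "and", "word2vec", "learn", "embeddings", "from", "text"],
    ["vectors", "capture", "semantic", "similarity", "between", "words"],
]

# sg=1 selects the skip-gram objective; sg=0 would use CBOW instead.
model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the learned vectors
    window=2,         # context window size
    min_count=1,      # keep every token in this toy corpus
    sg=1,
    epochs=50,
)

# Each vocabulary word is now a dense real-valued vector ...
vec = model.wv["embeddings"]          # numpy array of shape (50,)
# ... and similarity queries use cosine distance in that vector space.
print(model.wv.most_similar("words", topn=3))
```

On a realistic corpus, the learned vectors place semantically related words near each other, which is what downstream tasks exploit.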

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1811-1820 of 4002 papers

Title | Hype
Improved Neural Network-based Multi-label Classification with Better Initialization Leveraging Label Co-occurrence | 0
CopyBERT: A Unified Approach to Question Generation with Self-Attention | 0
Improved Semantic Representation for Domain-Specific Entities | 0
BERTrade: Using Contextual Embeddings to Parse Old French | 0
Improved Text Classification via Contrastive Adversarial Training | 0
Improved Word Embeddings with Implicit Structure Information | 0
Corpus specificity in LSA and Word2vec: the role of out-of-domain documents | 0
Analyzing Semantic Change in Japanese Loanwords | 0
Improve Lexicon-based Word Embeddings By Word Sense Disambiguation | 0
BERTMap: A BERT-based Ontology Alignment System | 0
Page 182 of 401

Leaderboard

No leaderboard results yet.