
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
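
To make the definition concrete, here is a minimal sketch of that word-to-vector mapping using numpy. The vocabulary, the 8-dimensional size, and the random values are all hypothetical choices for illustration; in a real system the embedding matrix is learned from data rather than sampled at random.

```python
import numpy as np

# Hypothetical toy vocabulary; real systems use tens of thousands of words.
vocab = ["king", "queen", "apple", "orange"]
word_to_idx = {w: i for i, w in enumerate(vocab)}

# The embedding matrix: one row of real numbers per word.
# Random here purely for illustration; in practice these values are learned.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), 8))  # 8-dimensional vectors

def vector(word):
    """Map a word to its real-valued vector (the core of word embedding)."""
    return embeddings[word_to_idx[word]]

def cosine_similarity(a, b):
    """Compare two word vectors; trained embeddings place related words closer."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity(vector("king"), vector("queen")))
```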

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that are trained on an NLP task such as language modeling or document classification.
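
As a minimal sketch of one such technique, the snippet below trains a tiny skip-gram Word2Vec model with the gensim library (assuming gensim 4.x is installed). The corpus and all parameter values are made-up toy choices, so the resulting vectors are only illustrative.

```python
from gensim.models import Word2Vec  # assumes gensim 4.x

# Toy corpus: each sentence is a list of tokens. Real training
# uses millions of sentences (e.g., a Wikipedia dump).
sentences = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["a", "cat", "chased", "a", "dog"],
]

model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the learned vectors
    window=2,        # context window on each side of the target word
    min_count=1,     # keep all words, even singletons, in this toy setting
    sg=1,            # 1 = skip-gram; 0 = CBOW
    epochs=200,      # many passes since the corpus is tiny
    seed=42,
)

# Every vocabulary word is now mapped to a 50-dimensional real vector.
print(model.wv["cat"].shape)         # (50,)
print(model.wv.most_similar("cat"))  # nearest neighbours by cosine similarity
```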

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3231–3240 of 4002 papers

Title | Hype
BERT-Based Neural Collaborative Filtering and Fixed-Length Contiguous Tokens Explanation | 0
BERT-based Ranking for Biomedical Entity Normalization | 0
BERTMap: A BERT-based Ontology Alignment System | 0
BERTrade: Using Contextual Embeddings to Parse Old French | 0
BERT's Conceptual Cartography: Mapping the Landscapes of Meaning | 0
BERT Transformer model for Detecting Arabic GPT2 Auto-Generated Tweets | 0
Better Automatic Evaluation of Open-Domain Dialogue Systems with Contextualized Embeddings | 0
Better Early than Late: Fusing Topics with Word Embeddings for Neural Question Paraphrase Identification | 0
Better OOV Translation with Bilingual Terminology Mining | 0
Better Word Representations with Recursive Neural Networks for Morphology | 0
Page 324 of 401

Leaderboard

No leaderboard results yet.