Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
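
To make the word-to-vector mapping concrete, here is a minimal sketch with a hand-built toy embedding table; the vocabulary and 4-dimensional vectors below are illustrative assumptions, not learned embeddings. In a real system the vectors are learned, and semantically related words end up close together under cosine similarity:

```python
# Minimal sketch of the word -> vector idea with a hypothetical,
# hand-built embedding table (real embeddings are learned, not hand-set).
import numpy as np

embeddings = {
    "king":  np.array([0.8, 0.6, 0.1, 0.2]),
    "queen": np.array([0.7, 0.7, 0.1, 0.3]),
    "apple": np.array([0.1, 0.0, 0.9, 0.8]),
}

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine of the angle between two embedding vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Related words have nearby vectors, unrelated words do not.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high (~0.99)
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low  (~0.27)
```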

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that are trained on an NLP task such as language modeling or document classification.
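
For instance, a Word2Vec model can be trained in a few lines with the gensim library. The sketch below assumes gensim 4.x is installed and uses a tiny illustrative corpus; real training corpora contain millions of sentences:

```python
# Minimal Word2Vec training sketch using gensim (assumes gensim 4.x;
# the toy corpus below is an illustrative assumption).
from gensim.models import Word2Vec

# Tokenized sentences; in practice this would be a large corpus.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "apple", "fell", "from", "the", "tree"],
]

model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the learned embeddings
    window=2,        # context words considered on each side of the target
    min_count=1,     # keep every word, even singletons (toy corpus)
    sg=1,            # 1 = skip-gram; 0 = CBOW
    epochs=50,
)

vector = model.wv["king"]                  # learned 50-dimensional vector
neighbors = model.wv.most_similar("king")  # nearest words by cosine similarity
print(vector.shape, neighbors[:3])
```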

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1431–1440 of 4002 papers

Title | Status | Hype
Development of Word Embeddings for Uzbek Language | — | 0
Leader: Prefixing a Length for Faster Word Vector Serialization | Code | 0
Metaphor Detection using Deep Contextualized Word Embeddings | — | 0
CogniFNN: A Fuzzy Neural Network Framework for Cognitive Word Embedding Evaluation | — | 0
Visual-Semantic Embedding Model Informed by Structured Knowledge | — | 0
Exploring the Linear Subspace Hypothesis in Gender Bias Mitigation | Code | 0
Word class flexibility: A deep contextualized approach | Code | 0
An Interpretable and Uncertainty Aware Multi-Task Framework for Multi-Aspect Sentiment Analysis | Code | 0
More Embeddings, Better Sequence Labelers? | — | 0
Compositional and Lexical Semantics in RoBERTa, BERT and DistilBERT: A Case Study on CoQA | — | 0

Leaderboards

No leaderboard results yet.