
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
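To make the definition above concrete, here is a minimal sketch of learning embeddings with the gensim library's Word2Vec implementation. The toy corpus and all hyperparameter values are illustrative assumptions, not recommendations from this page.

```python
# Minimal Word2Vec sketch using gensim (assumed corpus and hyperparameters).
from gensim.models import Word2Vec

# Tiny tokenized corpus standing in for a real training set.
sentences = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["dogs", "and", "cats", "are", "pets"],
]

# sg=1 selects the skip-gram objective; vector_size is the embedding dimension.
model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, sg=1, epochs=50)

vec = model.wv["cat"]          # the learned 50-dimensional vector for "cat"
print(vec.shape)               # (50,)
print(model.wv.most_similar("cat", topn=3))  # nearest neighbours in embedding space
```

Each word is thereby mapped to a real-valued vector, and geometric closeness in that space reflects distributional similarity in the training corpus.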

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2031–2040 of 4002 papers

Title | Status | Hype
Word2Sense: Sparse Interpretable Word Embeddings | - | 0
Unsupervised Joint Training of Bilingual Word Embeddings | - | 0
Putting Evaluation in Context: Contextual Embeddings Improve Machine Translation Evaluation | Code | 0
Embedding Strategies for Specialized Domains: Application to Clinical Entity Recognition | Code | 0
Are Girls Neko or Shōjo? Cross-Lingual Alignment of Non-Isomorphic Embeddings with Iterative Normalization | - | 0
Unsupervised Multilingual Word Embedding with Limited Resources using Neural Language Models | Code | 1
ARHNet - Leveraging Community Interaction for Detection of Religious Hate Speech in Arabic | - | 0
Neural Temporality Adaptation for Document Classification: Diachronic Word Embeddings and Domain Adaptation Models | Code | 0
Robust to Noise Models in Natural Language Processing Tasks | Code | 0
Reliability-aware Dynamic Feature Composition for Name Tagging | Code | 0
Page 204 of 401

No leaderboard results yet.