
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches, often neural network-based, that train on an NLP task such as language modeling or document classification.
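As an illustration, here is a minimal sketch of training Word2Vec embeddings with the gensim library (assuming gensim >= 4.0; the toy corpus and parameter values are illustrative choices, not taken from any paper listed below):

```python
# Minimal sketch: learning word embeddings with Word2Vec via gensim.
# Assumes gensim >= 4.0; the toy corpus below is illustrative only.
from gensim.models import Word2Vec

# Tiny tokenized corpus (each document is a list of word tokens).
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "vectors", "from", "context", "windows"],
    ["similar", "words", "get", "similar", "vectors"],
]

# Train a skip-gram model: each vocabulary word is mapped to a
# 50-dimensional real-valued vector learned from co-occurrence contexts.
model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the embedding space
    window=2,         # context window size
    min_count=1,      # keep every token in this toy corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

# Look up the learned vector for a word and query nearest neighbours.
vec = model.wv["vectors"]                       # numpy array, shape (50,)
print(vec[:5])
print(model.wv.most_similar("words", topn=3))   # most similar words by cosine
```

On a real corpus the vocabulary and context windows would be far larger; the nearest-neighbour query is the standard way to inspect whether semantically related words have ended up close together in the vector space.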

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3371-3380 of 4002 papers

Title | Status | Hype
New word analogy corpus for exploring embeddings of Czech words | Code | 0
Scoring Lexical Entailment with a Supervised Directional Similarity Network | Code | 0
Knowing Where and What: Unified Word Block Pretraining for Document Understanding | Code | 0
Knowledge-aware attentional neural network for review-based movie recommendation with explanations | Code | 0
niksss at HinglishEval: Language-agnostic BERT-based Contextual Embeddings with Catboost for Quality Evaluation of the Low-Resource Synthetically Generated Code-Mixed Hinglish Text | Code | 0
Unsupervised Learning of Sentence Embeddings using Compositional n-Gram Features | Code | 0
Do Acoustic Word Embeddings Capture Phonological Similarity? An Empirical Study | Code | 0
Do CoNLL-2003 Named Entity Taggers Still Work Well in 2023? | Code | 0
NILC-USP at SemEval-2017 Task 4: A Multi-view Ensemble for Twitter Sentiment Analysis | Code | 0
Document Embedding with Paragraph Vectors | Code | 0
Page 338 of 401

Leaderboards

No leaderboard results yet.