SOTAVerified

Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
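As a minimal illustration of the definition above, the sketch below maps a few words to hand-picked 3-dimensional vectors and compares them with cosine similarity. The vectors and words are invented for this example; real embeddings are learned from data and typically have hundreds of dimensions.

```python
import math

# Toy 3-dimensional embeddings, hand-picked for illustration only.
# Learned embeddings (e.g. from Word2Vec or GloVe) would be trained,
# not written by hand, and far higher-dimensional.
embeddings = {
    "king":  [0.9, 0.8, 0.1],
    "queen": [0.9, 0.7, 0.2],
    "apple": [0.1, 0.2, 0.9],
}

def cosine(u, v):
    """Cosine similarity between two real-valued vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(a * a for a in v))
    return dot / (norm_u * norm_v)

# Semantically related words should end up with similar vectors,
# i.e. a higher cosine similarity.
print(cosine(embeddings["king"], embeddings["queen"]))  # high
print(cosine(embeddings["king"], embeddings["apple"]))  # low
```

The useful property is that geometric closeness in the vector space stands in for semantic relatedness, which is what downstream NLP tasks exploit.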

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that are trained on an NLP task such as language modeling or document classification.
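To make the training setup concrete, the sketch below generates the (center, context) word pairs that skip-gram Word2Vec is trained on, using a sliding context window. This is only the data-preparation step, not the full model; the function name and example sentence are illustrative.

```python
def skipgram_pairs(tokens, window=2):
    """Generate (center, context) training pairs from a token list,
    as used by skip-gram Word2Vec: each word predicts its neighbors
    within the given window size."""
    pairs = []
    for i, center in enumerate(tokens):
        lo = max(0, i - window)
        hi = min(len(tokens), i + window + 1)
        for j in range(lo, hi):
            if j != i:  # a word is not its own context
                pairs.append((center, tokens[j]))
    return pairs

sentence = "the cat sat on the mat".split()
print(skipgram_pairs(sentence, window=1))
```

Training then adjusts each word's vector so that it scores its observed context words highly, which is what pulls words appearing in similar contexts toward similar vectors.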

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2251–2275 of 4002 papers

All papers below currently show a Hype score of 0.

- Single Training Dimension Selection for Word Embedding with PCA
- Sinhala Sentence Embedding: A Two-Tiered Structure for Low-Resource Languages
- Skip-Gram − Zipf + Uniform = Vector Additivity
- Skip-Thought GAN: Generating Text through Adversarial Training using Skip-Thought Vectors
- SMM4H Shared Task 2020 - A Hybrid Pipeline for Identifying Prescription Drug Abuse from Twitter: Machine Learning, Deep Learning, and Post-Processing
- Social Biases in Automatic Evaluation Metrics for NLG
- Social Image Tags as a Source of Word Embeddings: A Task-oriented Evaluation
- Social Media Text Processing and Semantic Analysis for Smart Cities
- Social Support Detection from Social Media Texts
- Social World Knowledge: Modeling and Applications
- SoftMatcha: A Soft and Fast Pattern Matcher for Billion-Scale Corpus Searches
- Softmax Bottleneck Makes Language Models Unable to Represent Multi-mode Word Distributions
- Solving Data Sparsity for Aspect Based Sentiment Analysis Using Cross-Linguality and Multi-Linguality
- SOS: Systematic Offensive Stereotyping Bias in Word Embeddings
- Sound Analogies with Phoneme Embeddings
- SoundChoice: Grapheme-to-Phoneme Models with Semantic Disambiguation
- Sounds Wilde. Phonetically Extended Embeddings for Author-Stylized Poetry Generation
- Sound-Word2Vec: Learning Word Representations Grounded in Sounds
- Span-Aggregatable, Contextualized Word Embeddings for Effective Phrase Mining
- Span-based discontinuous constituency parsing: a family of exact chart-based algorithms with time complexities from O(n^6) down to O(n^3)
- Spanish Biomedical and Clinical Language Embeddings
- Spanish NER with Word Representations and Conditional Random Fields
- Sparse Coding of Neural Word Embeddings for Multilingual Sequence Labeling

Leaderboard

No leaderboard results yet.