
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
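
As a minimal sketch of what "mapped to vectors of real numbers" means, the Python snippet below looks up toy embedding vectors and compares them with cosine similarity. The three-dimensional vectors and their values are invented purely for illustration; real embeddings are typically 50 to 1000 dimensions and learned from data.

    import numpy as np

    # Toy 3-dimensional embeddings (illustrative values, not from any trained model)
    embeddings = {
        "king":  np.array([0.80, 0.30, 0.10]),
        "queen": np.array([0.78, 0.32, 0.15]),
        "apple": np.array([0.05, 0.90, 0.20]),
    }

    def cosine_similarity(u, v):
        # Cosine of the angle between two embedding vectors;
        # semantically related words should score higher
        return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

    print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
    print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # lower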

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
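
As a concrete illustration, the sketch below trains a small skip-gram Word2Vec model with the gensim library (assuming the gensim ≥ 4.0 API; the toy corpus and hyperparameter values are for demonstration only, not recommended settings).

    from gensim.models import Word2Vec

    # Tiny illustrative corpus: each document is a list of tokens
    corpus = [
        ["word", "embeddings", "map", "words", "to", "dense", "vectors"],
        ["the", "skip", "gram", "model", "predicts", "context", "words"],
        ["glove", "builds", "vectors", "from", "global", "cooccurrence", "counts"],
    ]

    # sg=1 selects the skip-gram objective; vector_size is the embedding dimension
    model = Word2Vec(sentences=corpus, vector_size=50, window=2,
                     min_count=1, sg=1, epochs=100, seed=42)

    vec = model.wv["words"]            # a 50-dimensional real-valued vector
    print(vec.shape)                   # (50,)
    print(model.wv.most_similar("words", topn=3))  # nearest neighbours in the space

On a corpus this small the neighbours are essentially arbitrary; the point is only the workflow of training on raw text and then reading embeddings out of the model.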

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2251–2260 of 4002 papers

Title | Hype
Single Training Dimension Selection for Word Embedding with PCA | 0
Sinhala Sentence Embedding: A Two-Tiered Structure for Low-Resource Languages | 0
Skip-Gram − Zipf + Uniform = Vector Additivity | 0
Skip-Thought GAN: Generating Text through Adversarial Training using Skip-Thought Vectors | 0
SMM4H Shared Task 2020 - A Hybrid Pipeline for Identifying Prescription Drug Abuse from Twitter: Machine Learning, Deep Learning, and Post-Processing | 0
Social Biases in Automatic Evaluation Metrics for NLG | 0
Social Image Tags as a Source of Word Embeddings: A Task-oriented Evaluation | 0
Social Media Text Processing and Semantic Analysis for Smart Cities | 0
Social Support Detection from Social Media Texts | 0
Social World Knowledge: Modeling and Applications | 0

No leaderboard results yet.