
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
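At its simplest, a trained embedding table is a lookup from each vocabulary word to a dense vector, and relatedness between words is commonly measured with cosine similarity. A minimal sketch (the vectors below are made-up toy values, not trained embeddings):

```python
import numpy as np

# Toy illustration: a vocabulary mapped to dense vectors.
# These values are invented for the example, not trained.
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1]),
    "queen": np.array([0.7, 0.4, 0.2]),
    "apple": np.array([0.1, 0.9, 0.7]),
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should score higher than unrelated ones.
print(cosine(embeddings["king"], embeddings["queen"]))  # ~0.98
print(cosine(embeddings["king"], embeddings["apple"]))  # ~0.43
```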

Techniques for learning word embeddings include Word2Vec, GloVe, and neural network-based approaches that train on an NLP task such as language modeling or document classification.
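As a concrete example, a skip-gram Word2Vec model can be trained with the gensim library. The sketch below assumes gensim 4.x; the three-sentence corpus is a toy placeholder standing in for a large tokenized text collection:

```python
from gensim.models import Word2Vec

# Toy corpus: in practice this would be a large tokenized corpus.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

# sg=1 selects the skip-gram objective; sg=0 would use CBOW.
model = Word2Vec(
    sentences,
    vector_size=50,   # embedding dimensionality
    window=2,         # context window size
    min_count=1,      # keep every token in this tiny corpus
    sg=1,
    epochs=50,
)

vec = model.wv["embeddings"]                         # learned 50-d vector
print(model.wv.most_similar("embeddings", topn=3))   # nearest neighbors
```

On a real corpus, larger values of vector_size (100 to 300) and window (5 to 10) are typical, and min_count is raised to discard rare tokens.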

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1321–1330 of 4002 papers

Title | Status | Hype
A Two-Stage Approach for Computing Associative Responses to a Set of Stimulus Words | – | 0
Analyzing the Framing of 2020 Presidential Candidates in the News | – | 0
Country-level Arabic Dialect Identification Using Small Datasets with Integrated Machine Learning Techniques and Deep Learning Models | – | 0
Count-Based and Predictive Language Models for Exploring DeReKo | – | 0
A Twitter Corpus and Benchmark Resources for German Sentiment Analysis | – | 0
attr2vec: Jointly Learning Word and Contextual Attribute Embeddings with Factorization Machines | – | 0
A Dual Embedding Space Model for Document Ranking | – | 0
A Comparative Study of Transformers on Word Sense Disambiguation | – | 0
Correlation Analysis of Chronic Obstructive Pulmonary Disease (COPD) and its Biomarkers Using the Word Embeddings | – | 0
Correcting the Common Discourse Bias in Linear Representation of Sentences using Conceptors | – | 0
Page 133 of 401

Leaderboard

No leaderboard results yet.