
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
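
To make the mapping concrete, here is a minimal sketch in NumPy. The toy vocabulary and the randomly initialized embedding matrix are stand-ins for a trained model, so the similarity score below is not meaningful; the point is only the word-to-vector lookup and the standard cosine comparison.

```python
import numpy as np

# Hypothetical toy vocabulary; in practice the vocabulary and the
# embedding matrix come from training on a large corpus.
vocab = {"king": 0, "queen": 1, "apple": 2}

# Embedding matrix: one row of real numbers per word. 4 dimensions here;
# trained embeddings typically use 50-300 dimensions.
rng = np.random.default_rng(0)
embeddings = rng.random((len(vocab), 4))

def vector(word: str) -> np.ndarray:
    """Map a word to its real-valued vector (the embedding lookup)."""
    return embeddings[vocab[word]]

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Semantic similarity between words is commonly measured as the
    cosine of the angle between their vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# With random vectors this value is arbitrary; with trained embeddings,
# related words such as "king" and "queen" score high.
print(cosine_similarity(vector("king"), vector("queen")))
```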

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural approaches that train on an NLP task such as language modeling or document classification.
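
As one concrete illustration of such a technique, the sketch below trains a small skip-gram Word2Vec model with gensim. Gensim is not mentioned on this page; it is simply one widely used implementation, and the tiny corpus here is a placeholder for the large text collections these models are normally trained on.

```python
from gensim.models import Word2Vec

# Tiny tokenized corpus; real embeddings are trained on millions of sentences.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
    ["similar", "words", "get", "similar", "vectors"],
]

# sg=1 selects the skip-gram objective, vector_size sets the embedding
# dimensionality, and window sets the context size (gensim 4.x parameter names).
model = Word2Vec(sentences, vector_size=50, window=2,
                 min_count=1, sg=1, epochs=50)

vec = model.wv["embeddings"]           # the learned 50-dimensional vector
print(model.wv.most_similar("words"))  # nearest neighbours by cosine similarity
```

GloVe differs in that it factorizes a global word co-occurrence matrix rather than predicting context words, but the output is the same kind of object: one real-valued vector per vocabulary word.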

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3101–3110 of 4002 papers

Title | Status | Hype
A Retrofitting Model for Incorporating Semantic Relations into Word Embeddings | | 0
A Review of Cross-Domain Text-to-SQL Models | | 0
A Review of Standard Text Classification Practices for Multi-label Toxicity Identification of Online Content | | 0
A Review on Deep Learning Techniques Applied to Answer Selection | | 0
Are Word Embedding-based Features Useful for Sarcasm Detection? | | 0
Are Word Embeddings Really a Bad Fit for the Estimation of Thematic Fit? | | 0
“Are you calling for the vaporizer you ordered?” Combining Search and Prediction to Identify Orders in Contact Centers | | 0
ArGoT: A Glossary of Terms extracted from the arXiv | | 0
Argumentative Topology: Finding Loop(holes) in Logic | | 0
Argument from Old Man’s View: Assessing Social Bias in Argumentation | | 0

Leaderboard

No leaderboard results yet.