
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
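
To make the mapping concrete, here is a minimal sketch in Python: the tiny vocabulary and three-dimensional vectors are invented for illustration (learned embeddings typically have 50–300 dimensions), and cosine similarity is used to compare the resulting vectors.

```python
import numpy as np

# Toy embedding table: each word maps to a vector of real numbers.
# In practice these vectors are learned from data, not hand-picked.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10]),
    "queen": np.array([0.78, 0.70, 0.12]),
    "apple": np.array([0.10, 0.20, 0.90]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; values near 1 mean similar."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words sit closer together in the vector space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high (~0.99)
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low (~0.31)
```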

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
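
As a concrete example, the following is a minimal sketch of training Word2Vec embeddings with the gensim library (assuming gensim 4.x; the three-sentence corpus is invented for illustration, and real training uses far larger corpora).

```python
from gensim.models import Word2Vec

# Tiny pre-tokenized corpus; purely illustrative.
sentences = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "log"],
    ["cats", "and", "dogs", "are", "pets"],
]

# Train a skip-gram model (sg=1); vector_size is the embedding dimension.
model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, sg=1, epochs=50)

# Look up a learned vector and query the nearest neighbours in embedding space.
vector = model.wv["cat"]                    # a 50-dimensional numpy array
print(model.wv.most_similar("cat", topn=3))
```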

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 3611–3620 of 4002

Detecting Most Frequent Sense using Word Embeddings and BabelNet
Detecting Paraphrases of Standard Clause Titles in Insurance Contracts
Detecting Policy Preferences and Dynamics in the UN General Debate with Neural Word Embeddings
Detecting Requirements Smells With Deep Learning: Experiences, Challenges and Future Work
Detecting Sarcasm Using Different Forms Of Incongruity
Detecting Semantically Equivalent Questions in Online User Forums
Detecting Unassimilated Borrowings in Spanish: An Annotated Corpus and Approaches to Modeling
Detecting Unseen Multiword Expressions in American Sign Language
Detecting weak and strong Islamophobic hate speech on social media
Detection of Adverse Drug Reaction in Tweets Using a Combination of Heterogeneous Word Embeddings
