
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
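As a minimal sketch of this idea, the snippet below maps a few words to hand-picked toy vectors and compares them with cosine similarity. The words, the 4-dimensional size, and all the values are illustrative assumptions, not learned embeddings; real embeddings typically have 50–300+ dimensions learned from large corpora.

```python
import numpy as np

# Toy word-to-vector mapping; values are made up purely for illustration.
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1, 0.9]),
    "queen": np.array([0.7, 0.4, 0.1, 0.9]),
    "apple": np.array([0.1, 0.9, 0.8, 0.2]),
}

def cosine_similarity(u, v):
    """Similarity of two embedding vectors: 1.0 means same direction."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should end up closer in the vector space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```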

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
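As a concrete example of one such technique, here is a hedged sketch of training a Word2Vec model with the gensim library (assuming the gensim 4.x API); the three-sentence corpus is invented for illustration, and real training uses millions of tokenized sentences.

```python
from gensim.models import Word2Vec

# A tiny invented corpus: each sentence is a list of tokens.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["similar", "words", "get", "similar", "vectors"],
]

# Train a skip-gram Word2Vec model.
model = Word2Vec(
    sentences,
    vector_size=50,  # embedding dimensionality
    window=2,        # context window size
    min_count=1,     # keep all words in this toy corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
)

# Look up the learned vector for a word and its nearest neighbours.
vector = model.wv["embeddings"]  # a 50-dimensional numpy array
print(model.wv.most_similar("embeddings", topn=3))
```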

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 721–730 of 4002 papers

Title | Hype
Classifying Out-of-vocabulary Terms in a Domain-Specific Social Media Corpus | 0
A Review on Deep Learning Techniques Applied to Answer Selection | 0
Classifying Semantic Clause Types: Modeling Context and Genre Characteristics with Recurrent Neural Networks and Attention | 0
Classifying Text-Based Conspiracy Tweets related to COVID-19 using Contextualized Word Embeddings | 0
CLCL (Geneva) DINN Parser: a Neural Network Dependency Parser Ten Years Later | 0
CLFD: A Novel Vectorization Technique and Its Application in Fake News Detection | 0
Clickbait detection using word embeddings | 0
Clinical Abbreviation Disambiguation Using Neural Word Embeddings | 0
“Are you calling for the vaporizer you ordered?” Combining Search and Prediction to Identify Orders in Contact Centers | 0
BLISS in Non-Isometric Embedding Spaces | 0
Page 73 of 401

No leaderboard results yet.