
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
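The mapping can be pictured as a lookup from words to real-valued vectors, with geometric closeness standing in for semantic relatedness. A minimal sketch (the vocabulary, dimensions, and vector values below are made up for illustration; real embeddings are learned from data):

```python
import math

# Hypothetical 3-dimensional vectors for a tiny vocabulary.
# Real embedding tables are learned and typically have 50-300+ dimensions.
embeddings = {
    "king":  [0.8, 0.6, 0.1],
    "queen": [0.7, 0.7, 0.2],
    "apple": [0.1, 0.2, 0.9],
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Semantically related words end up closer in the vector space.
sim_related = cosine(embeddings["king"], embeddings["queen"])
sim_unrelated = cosine(embeddings["king"], embeddings["apple"])
```

Cosine similarity is the usual closeness measure because it ignores vector magnitude and compares direction only.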

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches that train vector representations on an NLP task such as language modeling or document classification.

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3351-3360 of 4002 papers

Title | Status | Hype
Classification of Micro-Texts Using Sub-Word Embeddings | | 0
Classifying Out-of-vocabulary Terms in a Domain-Specific Social Media Corpus | | 0
Classifying Semantic Clause Types: Modeling Context and Genre Characteristics with Recurrent Neural Networks and Attention | | 0
Classifying Text-Based Conspiracy Tweets related to COVID-19 using Contextualized Word Embeddings | | 0
CLCL (Geneva) DINN Parser: a Neural Network Dependency Parser Ten Years Later | | 0
CLFD: A Novel Vectorization Technique and Its Application in Fake News Detection | | 0
Clickbait detection using word embeddings | | 0
Clinical Abbreviation Disambiguation Using Neural Word Embeddings | | 0
Clinical Event Detection with Hybrid Neural Architecture | | 0
Clinical Named Entity Recognition using Contextualized Token Representations | | 0
Page 336 of 401

No leaderboard results yet.