
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
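
Concretely, the mapping is a lookup from each vocabulary word to a dense real-valued vector, and semantic relatedness between words can then be measured geometrically. Below is a minimal sketch in Python with NumPy; the three-dimensional vectors and the tiny vocabulary are invented purely for illustration.

```python
# A minimal sketch of the word -> vector mapping, using a toy, hand-built
# embedding table (the words and the numbers are illustrative only).
import numpy as np

embeddings = {
    "king":  np.array([0.8, 0.3, 0.1]),
    "queen": np.array([0.7, 0.4, 0.1]),
    "apple": np.array([0.1, 0.2, 0.9]),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two embedding vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Semantically related words end up with higher cosine similarity.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```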

Techniques for learning word embeddings include Word2Vec, GloVe, and neural network-based approaches that train on an NLP task such as language modeling or document classification.
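
As one concrete example of such a technique, the sketch below trains a small skip-gram Word2Vec model with the gensim library. It assumes the gensim 4.x API; the toy corpus and hyperparameters are illustrative, not a recommended setup.

```python
# A minimal Word2Vec training sketch (assumes gensim >= 4.0).
from gensim.models import Word2Vec

# Tiny illustrative corpus: a list of tokenized sentences.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "dog", "chases", "the", "cat"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the learned vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram objective, 0 = CBOW
    epochs=100,
)

vec = model.wv["king"]                         # the 50-dim vector for "king"
print(model.wv.most_similar("king", topn=2))   # nearest neighbors by cosine
```

Setting sg=1 selects the skip-gram objective, which predicts context words from a center word; sg=0 would instead use CBOW, which predicts the center word from its context.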

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1861–1870 of 4002 papers

- Expanding Subjective Lexicons for Social Media Mining with Embedding Subspaces
- Clinical Named Entity Recognition using Contextualized Token Representations
- ArGoT: A Glossary of Terms extracted from the arXiv
- ExB Themis: Extensive Feature Extraction from Word Alignments for Semantic Textual Similarity
- INAOE-UPV at SemEval-2018 Task 3: An Ensemble Approach for Irony Detection in Twitter
- Including Semantic Information via Word Embeddings for Skeleton-based Action Recognition
- Example-based Acquisition of Fine-grained Collocation Resources
- Incorporating Context into Language Encoding Models for fMRI
- Clinical Event Detection with Hybrid Neural Architecture
- Examining European Press Coverage of the Covid-19 No-Vax Movement: An NLP Framework

No leaderboard results yet.