Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
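
To make the mapping concrete, here is a minimal sketch of what such a vocabulary-to-vector lookup looks like, compared with cosine similarity. The vectors below are made-up toy values for illustration, not learned embeddings; real models use hundreds of dimensions learned from data.

```python
import numpy as np

# Toy 4-dimensional embeddings; the numbers here are invented
# for illustration, not the output of any trained model.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10, 0.05]),
    "queen": np.array([0.78, 0.70, 0.12, 0.09]),
    "apple": np.array([0.05, 0.10, 0.95, 0.30]),
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors; 1.0 means identical direction."""
    return np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))

# Semantically related words should end up with higher similarity.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```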

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
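
As an illustration of one such technique, the following sketch trains a small skip-gram Word2Vec model with the gensim library (assuming gensim 4.x). The corpus and all parameter values are toy choices, not recommended settings.

```python
from gensim.models import Word2Vec

# A tiny toy corpus: each document is a list of tokens.
# Real training uses millions of sentences.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "apple", "fell", "from", "the", "tree"],
]

# Skip-gram Word2Vec (sg=1): learn vectors by predicting
# context words from each center word.
model = Word2Vec(
    corpus,
    vector_size=32,  # embedding dimensionality
    window=2,        # context window size
    min_count=1,     # keep every word, since the corpus is tiny
    sg=1,            # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

vec = model.wv["king"]                        # the learned 32-dim vector
print(model.wv.most_similar("king", topn=2))  # nearest neighbors in vector space
```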

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 731–740 of 4,002 papers

Title | Status | Hype
Alzheimer Disease Classification through ASR-based Transcriptions: Exploring the Impact of Punctuation and Pauses |  | 0
Clinical Named Entity Recognition using Contextualized Token Representations |  | 0
Clinical Text Classification with Rule-based Features and Knowledge-guided Convolutional Neural Networks |  | 0
ArGoT: A Glossary of Terms extracted from the arXiv |  | 0
CLULEX at SemEval-2021 Task 1: A Simple System Goes a Long Way |  | 0
Argumentative Topology: Finding Loop(holes) in Logic |  | 0
Clustering Comparable Corpora of Russian and Ukrainian Academic Texts: Word Embeddings and Semantic Fingerprints |  | 0
Clustering is Efficient for Approximate Maximum Inner Product Search |  | 0
Argument from Old Man’s View: Assessing Social Bias in Argumentation |  | 0
Blinov: Distributed Representations of Words for Aspect-Based Sentiment Analysis at SemEval 2014 |  | 0

Leaderboard

No leaderboard results yet.