
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
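As a concrete illustration of this word-to-vector mapping, the minimal sketch below stores a few hand-made vectors in a Python dictionary and compares them with cosine similarity; the words, dimensions, and values are invented for illustration, not learned from data.

```python
import numpy as np

# Toy embedding table: each word maps to a vector of real numbers.
# These vectors are hand-picked for illustration, not learned.
embeddings = {
    "king":  np.array([0.80, 0.45, 0.10]),
    "queen": np.array([0.78, 0.48, 0.12]),
    "apple": np.array([0.05, 0.90, 0.60]),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two vectors; values near 1 mean similar."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Semantically related words should end up with higher similarity.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # lower
```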

Techniques for learning word embeddings include neural approaches such as Word2Vec, count-based approaches such as GloVe, and other methods that train on an NLP task such as language modeling or document classification.
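To make this concrete, here is a minimal training sketch using the gensim library (an assumption of this example, not tied to any paper listed below). It assumes gensim >= 4.0; the corpus and hyperparameters are toy values chosen for illustration.

```python
from gensim.models import Word2Vec

# A tiny tokenized corpus; real training uses millions of sentences.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "farmer", "grows", "an", "apple"],
]

model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the learned vectors
    window=2,        # context words considered on each side of the target
    min_count=1,     # keep every word, even singletons (toy corpus)
    sg=1,            # 1 = skip-gram, 0 = CBOW
    epochs=100,      # many passes to get signal from the tiny corpus
)

print(model.wv["king"].shape)         # (50,) -- the embedding for "king"
print(model.wv.most_similar("king"))  # nearest neighbors in vector space
```

Skip-gram (sg=1) trains the embeddings by predicting context words from the target word; CBOW (sg=0) does the reverse.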

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 731–740 of 4002 papers (page 74 of 401)

Title | Status | Hype
Clinical Flair: A Pre-Trained Language Model for Spanish Clinical Natural Language Processing | Code | 0
Are you tough enough? Framework for Robustness Validation of Machine Comprehension Systems | Code | 0
Adapting Word Embeddings to New Languages with Morphological and Phonological Subword Representations | Code | 0
CLIP-Decoder: ZeroShot Multilabel Classification using Multimodal CLIP Aligned Representation | Code | 0
Encoding Category Trees Into Word-Embeddings Using Geometric Approach | Code | 0
Clustering-Based Article Identification in Historical Newspapers | Code | 0
Argument from Old Man's View: Assessing Social Bias in Argumentation | Code | 0
Deeper Text Understanding for IR with Contextual Neural Language Modeling | Code | 0
Interpretable Segmentation of Medical Free-Text Records Based on Word Embeddings | Code | 0
DeepEmo: Learning and Enriching Pattern-Based Emotion Representations | Code | 0