
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
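To make the idea concrete, here is a minimal sketch of what a trained embedding model ultimately provides: a table mapping each word to a real-valued vector, plus a similarity measure (commonly cosine similarity) over those vectors. The words and vector values below are illustrative toy data, not learned embeddings.

```python
import math

# Toy embedding table: each word maps to a small real-valued vector.
# In practice these vectors are learned (e.g., by Word2Vec or GloVe)
# and have hundreds of dimensions; these values are made up.
embeddings = {
    "king":  [0.80, 0.65, 0.10, 0.05],
    "queen": [0.78, 0.70, 0.12, 0.04],
    "apple": [0.05, 0.10, 0.90, 0.70],
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; values near 1 mean similar."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Semantically related words should score higher than unrelated ones.
sim_royal = cosine_similarity(embeddings["king"], embeddings["queen"])
sim_fruit = cosine_similarity(embeddings["king"], embeddings["apple"])
assert sim_royal > sim_fruit
```

The useful property is that geometric closeness in the vector space stands in for semantic relatedness, which is what downstream NLP tasks exploit.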

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 961–970 of 4002 papers

- DisCoDisCo at the DISRPT2021 Shared Task: A System for Discourse Segmentation, Classification, and Connective Detection (code available)
- Augmenting semantic lexicons using word embeddings and transfer learning (code available)
- Fast query-by-example speech search using separable model
- Contrastive Word Embedding Learning for Neural Machine Translation
- Gender Roles from Word Embeddings in a Century of Children’s Books
- Task-adaptive Pre-training of Language Models with Word Embedding Regularization
- Revisiting Tri-training of Dependency Parsers (code available)
- Comparing Feature-Engineering and Feature-Learning Approaches for Multilingual Translationese Classification
- Evaluating Biomedical BERT Models for Vocabulary Alignment at Scale in the UMLS Metathesaurus
- InceptionXML: A Lightweight Framework with Synchronized Negative Sampling for Short Text Extreme Classification (code available)
Page 97 of 401

No leaderboard results yet.