
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)
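As a concrete illustration of the mapping from words to real-valued vectors, the following is a minimal sketch of training Word2Vec embeddings with the gensim library (assuming gensim >= 4.0; the toy corpus and all hyperparameter values are illustrative choices, not taken from any paper listed below):

```python
from gensim.models import Word2Vec

# Toy corpus: each document is a list of tokens.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
    ["glove", "factorizes", "a", "global", "co-occurrence", "matrix"],
]

# sg=1 selects the skip-gram objective; sg=0 would use CBOW.
# min_count=1 keeps every token of this tiny corpus in the vocabulary.
model = Word2Vec(corpus, vector_size=50, window=3, min_count=1, sg=1, epochs=20)

# Each vocabulary word is now mapped to a 50-dimensional real-valued vector.
vec = model.wv["embeddings"]
print(vec.shape)  # (50,)

# Nearest neighbors by cosine similarity in the embedding space.
print(model.wv.most_similar("word", topn=3))
```

On a realistic corpus, semantically related words end up with nearby vectors, which is what makes these embeddings useful as features for downstream NLP tasks.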

Papers

Showing 3191–3200 of 4002 papers

Title | Status | Hype
Context-Aware Cross-Lingual Mapping | Code | 0
Augmenting Data with Mixup for Sentence Classification: An Empirical Study | Code | 0
Specializing Unsupervised Pretraining Models for Word-Level Semantic Similarity | Code | 0
Multi-Relational Hyperbolic Word Embeddings from Natural Language Definitions | Code | 0
MultiSeg: Parallel Data and Subword Information for Learning Bilingual Embeddings in Low Resource Scenarios | Code | 0
Subword-based Compact Reconstruction of Word Embeddings | Code | 0
Deep convolutional acoustic word embeddings using word-pair side information | Code | 0
Watset: Automatic Induction of Synsets from a Graph of Synonyms | Code | 0
Multi-sense embeddings through a word sense disambiguation process | Code | 0
DeepEmo: Learning and Enriching Pattern-Based Emotion Representations | Code | 0
Page 320 of 401

No leaderboard results yet.