
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
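Concretely, an embedding is a lookup table: each vocabulary word is assigned a row index into a real-valued matrix, and relatedness between words can be measured as similarity between their vectors. A minimal sketch in Python with NumPy (the toy vocabulary, dimensionality, and random initialization are illustrative assumptions, not part of any particular method):

```python
import numpy as np

# Toy vocabulary: each word gets a row index into the embedding matrix.
vocab = {"king": 0, "queen": 1, "apple": 2, "orange": 3}
dim = 8  # embedding dimensionality (assumed for illustration)

# Embedding matrix: one real-valued vector per word. In practice these
# values are learned by a model such as Word2Vec or GloVe; here they
# are random placeholders.
rng = np.random.default_rng(0)
E = rng.normal(size=(len(vocab), dim))

def embed(word: str) -> np.ndarray:
    """Map a word to its vector by table lookup."""
    return E[vocab[word]]

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity, the usual way to compare embeddings."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(embed("king"), embed("queen")))
```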

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification. A training sketch follows below.
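As one concrete example, the Gensim library implements Word2Vec, which learns such vectors from a corpus of tokenized sentences. A minimal sketch, assuming Gensim 4.x; the tiny corpus and hyperparameters are placeholders, and a useful model needs far more data:

```python
from gensim.models import Word2Vec

# Placeholder corpus: a few tokenized sentences. Real training corpora
# contain millions of sentences.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["apples", "and", "oranges", "are", "fruit"],
]

# Train a skip-gram Word2Vec model; hyperparameters are illustrative.
model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the learned vectors
    window=3,        # context window size
    min_count=1,     # keep every token in this tiny corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

# Each word now maps to a 50-dimensional real-valued vector.
vec = model.wv["king"]
print(vec.shape)  # (50,)
print(model.wv.most_similar("king", topn=2))
```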

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1961-1970 of 4002 papers

| Title | Status | Hype |
| --- | --- | --- |
| Data Augmentation with Unsupervised Machine Translation Improves the Structural Similarity of Cross-lingual Word Embeddings | | 0 |
| Investigating the Stability of Concrete Nouns in Word Embeddings | | 0 |
| Dependency Parsing for Urdu: Resources, Conversions and Learning | | 0 |
| IRISA at SMM4H 2018: Neural Network and Bagging for Tweet Classification | | 0 |
| Data-Driven Mitigation of Adversarial Text Perturbation | | 0 |
| Is Neural Topic Modelling Better than Clustering? An Empirical Study on Clustering with Contextual Embeddings for Topics | | 0 |
| Dependency Link Embeddings: Continuous Representations of Syntactic Substructures | | 0 |
| BERT-based Ensembles for Modeling Disclosure and Support in Conversational Social Media Text | | 0 |
| Isomorphic Cross-lingual Embeddings for Low-Resource Languages | | 0 |
| Dependency-Based Word Embeddings | | 0 |
