
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)
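As a concrete illustration of the word-to-vector mapping described above, here is a minimal sketch using the gensim implementation of Word2Vec; the toy corpus and all hyperparameter values are illustrative assumptions, not taken from this page.

```python
# Minimal sketch: learning word embeddings with Word2Vec via gensim
# (pip install gensim). The corpus below is hypothetical and far too
# small to yield useful vectors; it only demonstrates the API.
from gensim.models import Word2Vec

corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the real-valued vectors
    window=2,         # context window size around each target word
    min_count=1,      # keep every word in this tiny vocabulary
    sg=1,             # skip-gram objective (sg=0 would use CBOW)
    epochs=200,
    seed=0,
)

vec = model.wv["king"]               # a 50-dimensional numpy array
print(vec.shape)                     # (50,)
print(model.wv.most_similar("cat"))  # nearest neighbors by cosine similarity
```

On a realistic corpus, the nearest-neighbor query surfaces semantically related words; GloVe arrives at comparable vectors by factorizing global co-occurrence counts rather than predicting local context.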

Papers

Showing 1651–1660 of 4002 papers

Exploring Wasserstein Distance across Concept Embeddings for Ontology Matching
Generic and Specialized Word Embeddings for Multi-Domain Machine Translation
Generic Embedding-Based Lexicons for Transparent and Reproducible Text Scoring
Genre Separation Network with Adversarial Training for Cross-genre Relation Extraction
Exploring Vector Spaces for Semantic Relations
Geographically-Balanced Gigaword Corpora for 50 Language Varieties
CogniFNN: A Fuzzy Neural Network Framework for Cognitive Word Embedding Evaluation
Geometry-aware Domain Adaptation for Unsupervised Alignment of Word Embeddings
Artificial mental phenomena: Psychophysics as a framework to detect perception biases in AI models
A Mixture Model for Learning Multi-Sense Word Embeddings

No leaderboard results yet.