
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
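To make the mapping concrete, here is a minimal sketch of a word-to-vector lookup. The vocabulary, dimensionality, and vector values are all made up for illustration; they do not come from any trained model.

```python
import numpy as np

# Toy vocabulary and embedding matrix: one 4-dimensional real-valued
# vector per word. Trained models typically use 50-300+ dimensions
# learned from data; these numbers are invented for illustration.
vocab = {"king": 0, "queen": 1, "apple": 2}
embeddings = np.array([
    [0.80, 0.65, 0.10, 0.05],   # king
    [0.78, 0.70, 0.12, 0.04],   # queen
    [0.05, 0.02, 0.90, 0.75],   # apple
])

def embed(word: str) -> np.ndarray:
    """Map a word to its vector by table lookup."""
    return embeddings[vocab[word]]

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity: how close two word vectors point."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words end up with similar vectors.
print(cosine(embed("king"), embed("queen")))  # high (close to 1.0)
print(cosine(embed("king"), embed("apple")))  # low
```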

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
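As a concrete example, the sketch below trains a small Word2Vec model with gensim. Gensim is one common open-source implementation, chosen here as an assumption rather than something this page prescribes, and the toy corpus is invented for illustration.

```python
from gensim.models import Word2Vec

# Tiny toy corpus: a list of tokenized sentences. A real training
# corpus would contain millions of sentences.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["glove", "factorizes", "corpus", "cooccurrence", "statistics"],
]

# Train a skip-gram Word2Vec model (sg=1). vector_size sets the
# embedding dimensionality; window is the context size in tokens.
model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, sg=1)

vec = model.wv["embeddings"]          # the learned 50-d vector
print(model.wv.most_similar("word"))  # nearest neighbors by cosine
```

On a corpus this small the neighbors are essentially noise; the point is only the API shape: train on tokenized sentences, then query vectors and similarities from `model.wv`.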

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2361–2370 of 4002 papers

| Title | Status | Hype |
| --- | --- | --- |
| Normalization of Transliterated Words in Code-Mixed Data Using Seq2Seq Model & Levenshtein Distance | | 0 |
| Multilingual Embeddings Jointly Induced from Contexts and Concepts: Simple, Strong and Scalable | | 0 |
| GlobalTrait: Personality Alignment of Multilingual Word Embeddings | | 0 |
| Truly unsupervised acoustic word embeddings using weak top-down constraints in encoder-decoder models | Code | 0 |
| Learning Unsupervised Word Mapping by Maximizing Mean Discrepancy | | 0 |
| SIEVE: Helping Developers Sift Wheat from Chaff via Cross-Platform Analysis | | 0 |
| Measuring Issue Ownership using Word Embeddings | | 0 |
| Aligning Very Small Parallel Corpora Using Cross-Lingual Word Embeddings and a Monogamy Objective | | 0 |
| Attentive Neural Network for Named Entity Recognition in Vietnamese | Code | 0 |
| Word Mover's Embedding: From Word2Vec to Document Embedding | Code | 0 |
Page 237 of 401

No leaderboard results yet.