
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural approaches that train on an NLP task such as language modeling or document classification.
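As a concrete illustration of the word-to-vector mapping described above, here is a minimal sketch that trains skip-gram vectors with gensim's Word2Vec. It assumes gensim (>= 4.0) is installed; the toy corpus and hyperparameters are illustrative choices, not taken from any paper listed on this page.

```python
# Minimal sketch: learning word vectors with gensim's Word2Vec.
# Corpus and hyperparameters are illustrative assumptions.
from gensim.models import Word2Vec

corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["similar", "words", "get", "similar", "vectors"],
    ["glove", "and", "word2vec", "learn", "such", "vectors"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the real-valued vectors
    window=2,         # context window size
    min_count=1,      # keep every token in this toy corpus
    sg=1,             # use the skip-gram objective
)

vec = model.wv["words"]   # a 50-dimensional numpy array
print(vec.shape)          # (50,)

# Nearest neighbors in the learned vector space
print(model.wv.most_similar("words", topn=3))
```

On a real corpus, larger `vector_size` and `window` values and a nontrivial `min_count` are typical; the point here is only the mapping from vocabulary items to real-valued vectors.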

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 151–160 of 4,002 papers

Title | Status | Hype
Embedding Words in Non-Vector Space with Unsupervised Graph Learning | Code | 1
"Did you really mean what you said?": Sarcasm Detection in Hindi-English Code-Mixed Data using Bilingual Word Embeddings | Code | 1
BERT for Monolingual and Cross-Lingual Reverse Dictionary | Code | 1
Multi-Relational Embedding for Knowledge Graph Representation and Analysis | Code | 1
iNLTK: Natural Language Toolkit for Indic Languages | Code | 1
Modality-Transferable Emotion Embeddings for Low-Resource Multimodal Emotion Recognition | Code | 1
Latin BERT: A Contextual Language Model for Classical Philology | Code | 1
Vector Projection Network for Few-shot Slot Tagging in Natural Language Understanding | Code | 1
Dual-path CNN with Max Gated block for Text-Based Person Re-identification | Code | 1
Multilingual Music Genre Embeddings for Effective Cross-Lingual Music Item Annotation | Code | 1
Page 16 of 401

No leaderboard results yet.