
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches trained on an NLP task such as language modeling or document classification. A minimal training sketch follows below.
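As a concrete illustration of the idea, here is a minimal sketch of training Word2Vec embeddings. It assumes the gensim library (not mentioned in the original text) and uses a toy corpus purely for demonstration; real models are trained on much larger text collections.

```python
# Minimal Word2Vec training sketch, assuming gensim is installed
# (pip install gensim). The toy corpus below is illustrative only.
from gensim.models import Word2Vec

# A corpus is a list of tokenized sentences.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "and", "glove", "learn", "vectors", "from", "text"],
    ["similar", "words", "get", "similar", "vectors"],
]

# Train a small skip-gram model; vector_size is the embedding dimensionality.
model = Word2Vec(
    sentences=corpus,
    vector_size=50,  # each word maps to a 50-dimensional real-valued vector
    window=3,        # context window size
    min_count=1,     # keep every word in this tiny corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
)

# Every vocabulary word is now mapped to a vector of real numbers.
vector = model.wv["vectors"]
print(vector.shape)                    # -> (50,)
print(model.wv.most_similar("words"))  # nearest neighbors by cosine similarity
```

In practice, the choice between skip-gram and CBOW, the window size, and the dimensionality are tuned to the corpus size and downstream task.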

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1331–1340 of 4002 papers

Title | Status | Hype
Development of Word Embeddings for Uzbek Language | — | 0
BERT for Monolingual and Cross-Lingual Reverse Dictionary | Code | 1
Leader: Prefixing a Length for Faster Word Vector Serialization | Code | 0
Multi-Relational Embedding for Knowledge Graph Representation and Analysis | Code | 1
Metaphor Detection using Deep Contextualized Word Embeddings | — | 0
iNLTK: Natural Language Toolkit for Indic Languages | Code | 1
CogniFNN: A Fuzzy Neural Network Framework for Cognitive Word Embedding Evaluation | — | 0
Visual-Semantic Embedding Model Informed by Structured Knowledge | — | 0
Vector Projection Network for Few-shot Slot Tagging in Natural Language Understanding | Code | 1
Modality-Transferable Emotion Embeddings for Low-Resource Multimodal Emotion Recognition | Code | 1

Leaderboard

No leaderboard results yet.