
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
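To make the word-to-vector mapping concrete, here is a minimal NumPy sketch; the toy vocabulary, the 4-dimensional vectors, the random initialization, and the averaging rule for phrases are assumptions for illustration only, not a trained embedding.

```python
import numpy as np

# Toy embedding table: each vocabulary word is mapped to a dense vector
# of real numbers. (Random vectors stand in for trained ones here;
# a real table would come from a method such as Word2Vec or GloVe.)
vocab = ["king", "queen", "kingdom"]
dim = 4  # real systems typically use 50-300+ dimensions
rng = np.random.default_rng(seed=0)
embeddings = {word: rng.normal(size=dim) for word in vocab}

# A phrase can also be mapped to a single vector,
# e.g. by averaging the vectors of its words.
phrase = ["king", "kingdom"]
phrase_vec = np.mean([embeddings[w] for w in phrase], axis=0)
print(phrase_vec.shape)  # (4,)
```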

Techniques for learning word embeddings include Word2Vec and GloVe, as well as neural network-based approaches trained on an NLP task such as language modeling or document classification.
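As a concrete illustration of the Word2Vec technique named above, here is a minimal training sketch using the gensim library (gensim 4.x API assumed; the toy corpus and hyperparameter values are illustrative choices, not drawn from any paper listed below).

```python
from gensim.models import Word2Vec

# Toy corpus: each sentence is a list of tokens.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "dog", "chases", "the", "cat"],
]

# Train a skip-gram Word2Vec model.
# vector_size: embedding dimensionality; window: context size;
# min_count=1 keeps every token in this tiny corpus;
# epochs is raised because the toy dataset is so small.
model = Word2Vec(corpus, vector_size=50, window=2,
                 min_count=1, sg=1, epochs=200)

# Every vocabulary word is now mapped to a real-valued vector.
vec = model.wv["king"]
print(vec.shape)  # (50,)

# Nearest neighbours in the learned embedding space.
print(model.wv.most_similar("king", topn=3))
```

On a realistically sized corpus, the learned vectors place words that occur in similar contexts near each other, which is what makes them useful as features for downstream NLP tasks.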

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1871–1880 of 4002 papers

Title | Status | Hype
Lost in Evaluation: Misleading Benchmarks for Bilingual Dictionary Induction | Code | 0
Query Obfuscation Semantic Decomposition | | 0
Hope Speech Detection: A Computational Analysis of the Voice of Peace | | 0
Comprehensive Analysis of Aspect Term Extraction Methods using Various Text Embeddings | | 0
Multimodal Embeddings from Language Models | Code | 0
Definition Frames: Using Definitions for Hybrid Concept Representations | Code | 0
Follow the Leader: Documents on the Leading Edge of Semantic Change Get More Citations | Code | 0
Reverse Transfer Learning: Can Word Embeddings Trained for Different NLP Tasks Improve Neural Language Models? | | 0
Composing Knowledge Graph Embeddings via Word Embeddings | | 0
Distributed Training of Embeddings using Graph Analytics | | 0
Page 188 of 401

Leaderboard

No leaderboard results yet.