
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
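As a minimal sketch of that mapping, the snippet below builds a toy lookup table with NumPy; the vocabulary, the 4-dimensional vectors, and the `embed`/`cosine` helper names are made up for illustration (learned embeddings typically have 50-300 dimensions):

```python
import numpy as np

# Toy vocabulary and a random 4-dimensional embedding matrix.
# In practice the rows of this matrix are learned, not random.
vocab = {"king": 0, "queen": 1, "man": 2, "woman": 3}
embeddings = np.random.default_rng(0).normal(size=(len(vocab), 4))

def embed(word: str) -> np.ndarray:
    """Map a word to its row in the real-valued embedding matrix."""
    return embeddings[vocab[word]]

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity, the usual notion of closeness in embedding space."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(embed("king"), embed("queen")))
```

With trained (rather than random) vectors, semantically related words such as "king" and "queen" end up with high cosine similarity.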

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that are trained on an NLP task such as language modeling or document classification.
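As a small sketch of one such technique, the snippet below trains a skip-gram Word2Vec model with the gensim library (assuming gensim 4.x parameter names such as `vector_size`); the three-sentence corpus is made up purely for illustration:

```python
from gensim.models import Word2Vec

# A tiny made-up corpus; real training uses very large text collections.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "local", "context"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

# sg=1 selects the skip-gram objective; vector_size is the embedding dimension.
model = Word2Vec(corpus, vector_size=32, window=2, min_count=1, sg=1, epochs=50)

vec = model.wv["embeddings"]                      # learned 32-dim vector
print(model.wv.most_similar("embeddings", topn=3))  # nearest words by cosine
```

Word2Vec learns vectors by predicting context words, whereas GloVe fits vectors to global word co-occurrence statistics; both yield real-valued vectors usable in downstream NLP tasks.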

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1681-1690 of 4002 papers

Title | Status | Hype
Graph-based Nearest Neighbor Search in Hyperbolic Spaces | | 0
Graph Based Semi-Supervised Learning Approach for Tamil POS tagging | | 0
Graph-based Syntactic Word Embeddings | | 0
An evaluation of Czech word embeddings | | 0
HIT-SCIR at MRP 2019: A Unified Pipeline for Meaning Representation Parsing via Efficient Training and Effective Encoding | | 0
Hostility Detection and Covid-19 Fake News Detection in Social Media | | 0
How Much Does Tokenization Affect Neural Machine Translation? | | 0
Grapheme-level Awareness in Word Embeddings for Morphologically Rich Languages | | 0
Graph Exploration and Cross-lingual Word Embeddings for Translation Inference Across Dictionaries | | 0
Hybrid Improved Document-level Embedding (HIDE) | | 0

Leaderboard

No leaderboard results yet.