
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
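
To make the mapping concrete, here is a minimal sketch in Python (the four-dimensional vectors and the tiny vocabulary are invented for illustration; trained models typically use hundreds of dimensions learned from data) that stores words as real-valued vectors and compares them with cosine similarity:

```python
import numpy as np

# Hypothetical 4-dimensional embeddings; a real model learns these
# values from a corpus rather than hard-coding them.
embeddings = {
    "king":  np.array([0.8, 0.1, 0.6, 0.2]),
    "queen": np.array([0.7, 0.2, 0.6, 0.3]),
    "apple": np.array([0.1, 0.9, 0.0, 0.4]),
}

def cosine(u, v):
    # Cosine similarity: 1.0 means identical direction, near 0 means unrelated.
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(embeddings["king"], embeddings["queen"]))  # high: related words
print(cosine(embeddings["king"], embeddings["apple"]))  # lower: unrelated words
```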

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
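
As one concrete example, the sketch below trains a small skip-gram Word2Vec model with the gensim library (gensim and the toy corpus are assumptions made for illustration; the page does not prescribe any particular implementation):

```python
from gensim.models import Word2Vec

# Toy corpus: each sentence is a list of tokens. Real training corpora
# contain millions of sentences.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "local", "context"],
    ["glove", "learns", "embeddings", "from", "global", "co-occurrence", "counts"],
]

# sg=1 selects the skip-gram objective; vector_size is the embedding dimension.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

vector = model.wv["embeddings"]                           # learned 50-dim vector
neighbours = model.wv.most_similar("embeddings", topn=3)  # nearest words by cosine
print(vector.shape, neighbours)
```

GloVe, by contrast, fits embeddings to global word co-occurrence statistics rather than predicting words from local context windows.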

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1791–1800 of 4002 papers

BioReddit: Word Embeddings for User-Generated Biomedical NLP
Joint Semantic and Distributional Word Representations with Multi-Graph Embeddings
Bag-of-Words Transfer: Non-Contextual Techniques for Multi-Task Learning
Neural Cross-Lingual Relation Extraction Based on Bilingual Word Embedding Mapping
Probabilistic Bias Mitigation in Word Embeddings
How does Grammatical Gender Affect Noun Representations in Gender-Marking Languages? [Code]
LSTM Easy-first Dependency Parsing with Pre-trained Word Embeddings and Character-level Word Embeddings in Vietnamese
Detect Toxic Content to Improve Online Conversations
Word-level Textual Adversarial Attacking as Combinatorial Optimization [Code]
Latent Suicide Risk Detection on Microblog via Suicide-Oriented Word Embeddings and Layered Attention
