
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
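
As a concrete illustration of this vector-space view, here is a minimal Python sketch. The words and 4-dimensional vectors are made up for illustration; real embeddings typically have hundreds of dimensions, but the idea of comparing words by cosine similarity is the same:

```python
# Minimal sketch: words mapped to real-valued vectors, compared with
# cosine similarity. The vectors below are toy values, not learned.
import numpy as np

embeddings = {
    "king":  np.array([0.8, 0.3, 0.1, 0.9]),
    "queen": np.array([0.7, 0.4, 0.2, 0.9]),
    "apple": np.array([0.1, 0.9, 0.8, 0.2]),
}

def cosine(u, v):
    # Cosine of the angle between two vectors: 1.0 = same direction.
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(embeddings["king"], embeddings["queen"]))  # high: related words
print(cosine(embeddings["king"], embeddings["apple"]))  # lower: unrelated words
```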

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches that train on an NLP task such as language modeling or document classification.
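
For instance, a Word2Vec skip-gram model can be trained in a few lines. The sketch below assumes the gensim library (not something this page specifies) and uses a tiny toy corpus and illustrative hyperparameters:

```python
# Sketch of training Word2Vec embeddings with gensim (assumed library;
# the toy corpus and hyperparameter values are illustrative only).
from gensim.models import Word2Vec

corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "farmer", "grows", "an", "apple"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the learned vectors
    window=2,         # context window around each target word
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=200,
)

print(model.wv["king"][:5])                   # first few vector components
print(model.wv.most_similar("king", topn=2))  # nearest neighbors in vector space
```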

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 1001–1010 of 4,002

- Eliciting Explicit Knowledge From Domain Experts in Direct Intrinsic Evaluation of Word Embeddings for Specialized Domains (code available)
- Country-level Arabic Dialect Identification Using Small Datasets with Integrated Machine Learning Techniques and Deep Learning Models
- Multi-task Learning Using a Combination of Contextualised and Static Word Embeddings for Arabic Sarcasm Detection and Sentiment Analysis
- Using contextual and cross-lingual word embeddings to improve variety in template-based NLG for automated journalism
- Cross-Lingual Transfer Learning for Hate Speech Detection
- Handling Out-Of-Vocabulary Problem in Hangeul Word Embeddings
- RelWalk - A Latent Variable Model Approach to Knowledge Graph Embedding
- The Chinese Remainder Theorem for Compact, Task-Precise, Efficient and Secure Word Embeddings
- Exploiting Position and Contextual Word Embeddings for Keyphrase Extraction from Scientific Papers
- Clustering Word Embeddings with Self-Organizing Maps. Application on LaRoSeDa - A Large Romanian Sentiment Data Set

No leaderboard results yet.