
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
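
To make the definition concrete, the sketch below shows the core idea: each vocabulary word indexes a fixed-length real-valued vector, and geometric measures such as cosine similarity then act as a proxy for semantic similarity. The three-dimensional vectors here are invented purely for illustration; learned embeddings are typically 50 to 300 dimensions.

import numpy as np

# Toy embedding table: each vocabulary word maps to a dense real-valued
# vector. The values below are made up for the example, not learned.
embeddings = {
    "king":  np.array([0.8, 0.1, 0.6]),
    "queen": np.array([0.7, 0.2, 0.7]),
    "apple": np.array([0.1, 0.9, 0.2]),
}

def cosine_similarity(u, v):
    # Cosine of the angle between two vectors; higher means more similar.
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should sit closer together in the space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low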

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an auxiliary NLP task such as language modeling or document classification.
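
As one concrete training route, the sketch below uses the Gensim library's Word2Vec implementation. It assumes Gensim 4.x, where the embedding dimensionality parameter is named vector_size, and the tiny tokenized corpus is invented for illustration; real training corpora run to millions of sentences.

from gensim.models import Word2Vec

# Tiny tokenized corpus, invented for illustration.
corpus = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["cats", "and", "dogs", "are", "pets"],
]

# Train a skip-gram Word2Vec model (sg=1); vector_size sets the
# embedding dimensionality and window the context size.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

vec = model.wv["cat"]                        # learned 50-d vector for "cat"
print(model.wv.most_similar("cat", topn=3))  # nearest neighbours in the space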

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1661–1670 of 4002 papers

Effect of Text Color on Word Embeddings
A Hybrid Approach for Aspect-Based Sentiment Analysis Using Deep Contextual Word Embeddings and Hierarchical Attention (code available)
Too Many Claims to Fact-Check: Prioritizing Political Claims Based on Check-Worthiness
Exploring the Combination of Contextual Word Embeddings and Knowledge Graph Embeddings
Sentiment Analysis of Yelp Reviews: A Comparison of Techniques and Models (code available)
Extending Text Informativeness Measures to Passage Interestingness Evaluation (Language Model vs. Word Embedding)
Multi-Ontology Refined Embeddings (MORE): A Hybrid Multi-Ontology and Corpus-based Semantic Representation for Biomedical Concepts
Punctuation Prediction in Spontaneous Conversations: Can We Mitigate ASR Errors with Retrofitted Word Embeddings?
Improving Disfluency Detection by Self-Training a Self-Attentive Model
Word Equations: Inherently Interpretable Sparse Word Embeddings through Sparse Coding
