
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
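To make the mapping concrete, the sketch below builds a toy lookup table from words to real-valued vectors and compares them with cosine similarity. The vector values are made up for illustration, not trained embeddings.

import numpy as np

# Toy embedding table: each vocabulary word maps to a dense
# real-valued vector (values are illustrative, not trained).
embeddings = {
    "king":  np.array([0.50, 0.68, -0.10]),
    "queen": np.array([0.52, 0.71, -0.08]),
    "apple": np.array([-0.30, 0.05, 0.90]),
}

def cosine_similarity(u, v):
    # Cosine of the angle between two vectors; values near 1.0 mean similar.
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words get similar vectors under a good embedding.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low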

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
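As one possible illustration of such a technique, the sketch below trains a small skip-gram Word2Vec model with the gensim library. The tiny corpus, the hyperparameters, and the queried word are assumptions chosen for demonstration; a real model would be trained on a large corpus.

from gensim.models import Word2Vec

# Tiny illustrative corpus: each document is a list of tokens.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["apples", "grow", "on", "trees"],
]

# Train a skip-gram model (sg=1); vector_size sets the embedding
# dimensionality, window the context size around each target word.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

vector = model.wv["king"]                         # learned 50-dim embedding
similar = model.wv.most_similar("king", topn=2)   # nearest words by cosine
print(vector.shape, similar)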

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1261–1270 of 4002 papers

Title | Hype
A Framework for Understanding the Role of Morphology in Universal Dependency Parsing | 0
Employing Word Representations and Regularization for Domain Adaptation of Relation Extraction | 0
Effect of Text Color on Word Embeddings | 0
Effect of Text Processing Steps on Twitter Sentiment Classification using Word Embedding | 0
Effects of Creativity and Cluster Tightness on Short Text Clustering Performance | 0
Effects of Word Embeddings on Neural Network-based Pitch Accent Detection | 0
"A Passage to India": Pre-trained Word Embeddings for Indian Languages | 0
Determining Gains Acquired from Word Embedding Quantitatively Using Discrete Distribution Clustering | 0
A Cross-lingual Natural Language Processing Framework for Infodemic Management | 0
Determining Code Words in Euphemistic Hate Speech Using Word Embedding Networks | 0

No leaderboard results yet.