
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that are trained on an NLP task such as language modeling or document classification.
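To make the "words mapped to vectors of real numbers" idea concrete, here is a minimal sketch of a count-based distributional embedding: each word's vector is its co-occurrence counts with every vocabulary word, and cosine similarity compares words. This is a toy illustration of the distributional principle underlying Word2Vec and GloVe, not an implementation of either; the function names and the tiny corpus are invented for the example.

```python
import math

def cooccurrence_vectors(sentences, window=2):
    """Map each word to a vector of co-occurrence counts over the vocabulary.

    sentences: list of tokenized sentences (lists of words).
    window: how many neighbors on each side count as context.
    """
    vocab = sorted({w for s in sentences for w in s})
    index = {w: i for i, w in enumerate(vocab)}
    vecs = {w: [0.0] * len(vocab) for w in vocab}
    for s in sentences:
        for i, w in enumerate(s):
            for j in range(max(0, i - window), min(len(s), i + window + 1)):
                if j != i:
                    vecs[w][index[s[j]]] += 1.0
    return vecs

def cosine(u, v):
    """Cosine similarity between two vectors; 0.0 for a zero vector."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

# Hypothetical toy corpus: "cat" and "dog" occur in similar contexts.
sentences = [
    "the cat sat on the mat".split(),
    "the dog sat on the rug".split(),
    "a cat and a dog played".split(),
]
vecs = cooccurrence_vectors(sentences)
# Words with similar contexts get similar vectors, so this similarity
# is higher than, say, cosine(vecs["cat"], vecs["on"]).
print(cosine(vecs["cat"], vecs["dog"]))
```

Neural methods such as Word2Vec replace these sparse count vectors with low-dimensional dense vectors learned by predicting context words, but the geometric intuition (similar contexts produce nearby vectors) is the same.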

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2391-2400 of 4002 papers

Title | Status | Hype
Is there Gender bias and stereotype in Portuguese Word Embeddings? | - | 0
textTOvec: Deep Contextualized Neural Autoregressive Topic Models of Language with Distributed Compositional Prior | Code | 0
Word Embeddings from Large-Scale Greek Web Content | - | 0
End-to-End Text Classification via Image-based Embedding using Character-level Networks | Code | 0
Understanding the Origins of Bias in Word Embeddings | Code | 1
Text-based Sentiment Analysis and Music Emotion Recognition | - | 0
Italian Event Detection Goes Deep Learning | Code | 0
Neural Networks for Cross-lingual Negation Scope Detection | - | 0
A Comparative Study of Neural Network Models for Sentence Classification | - | 0
A Deep Learning Architecture for De-identification of Patient Notes: Implementation and Evaluation | - | 0
Page 240 of 401

No leaderboard results yet.