
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
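As a concrete illustration, the following is a minimal sketch of training skip-gram Word2Vec embeddings on a toy corpus. It assumes the gensim library (4.x); the corpus, hyperparameters, and variable names are illustrative only and are not drawn from any paper listed below.

# Minimal sketch: skip-gram Word2Vec on a toy corpus (assumes gensim 4.x).
from gensim.models import Word2Vec

# Each "sentence" is a pre-tokenized list of words.
corpus = [
    ["king", "rules", "the", "kingdom"],
    ["queen", "rules", "the", "kingdom"],
    ["dog", "chases", "the", "cat"],
    ["cat", "chases", "the", "mouse"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the real-valued word vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=200,       # many passes, since the corpus is tiny
    workers=1,        # single worker for reproducibility with a fixed seed
    seed=42,
)

# Each vocabulary word is now mapped to a vector of real numbers.
vec = model.wv["king"]
print(vec.shape)                               # (50,)
print(model.wv.most_similar("king", topn=3))   # nearest neighbors in vector space

After training, every word in the vocabulary is mapped to a 50-dimensional real-valued vector, and nearest neighbors in that vector space tend to be words that appear in similar contexts.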

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2381-2390 of 4002 papers

Title | Status | Hype
Proactive Security: Embedded AI Solution for Violent and Abusive Speech Recognition | - | 0
Named Entity Recognition on Twitter for Turkish using Semi-supervised Learning with Word Embeddings | - | 0
pair2vec: Compositional Word-Pair Embeddings for Cross-Sentence Inference | Code | 1
pioNER: Datasets and Baselines for Armenian Named Entity Recognition | Code | 0
Analysis of Railway Accidents' Narratives Using Deep Learning | Code | 0
Skip-Thought GAN: Generating Text through Adversarial Training using Skip-Thought Vectors | - | 0
Subword Semantic Hashing for Intent Classification on Small Datasets | Code | 0
TNE: A Latent Model for Representation Learning on Networks | - | 0
Poincaré GloVe: Hyperbolic Word Embeddings | Code | 0
A Multimodal Approach towards Emotion Recognition of Music using Audio and Lyrical Content | - | 0
Page 239 of 401

No leaderboard results yet.