
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) in which words or phrases from the vocabulary are mapped to vectors of real numbers.
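As a toy illustration of this idea (not learned embeddings), the sketch below hand-picks small vectors for a few words and compares them with cosine similarity, the standard way of measuring closeness between embedding vectors. All vector values here are made up for demonstration; real embeddings are learned from data and typically have hundreds of dimensions.

```python
# Toy illustration: words represented as vectors of real numbers,
# compared with cosine similarity. The vectors are hand-picked, not learned.
import numpy as np

embeddings = {
    "king":  np.array([0.90, 0.80, 0.10, 0.00]),
    "queen": np.array([0.85, 0.90, 0.05, 0.10]),
    "apple": np.array([0.00, 0.10, 0.90, 0.80]),
}

def cosine(u, v):
    """Cosine similarity: near 1.0 for similar directions, near 0.0 for unrelated."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(embeddings["king"], embeddings["queen"]))  # high (~0.99)
print(cosine(embeddings["king"], embeddings["apple"]))  # low  (~0.12)
```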

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
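As a rough sketch of how one such technique is used in practice, the following trains a skip-gram Word2Vec model with the gensim library. The tiny corpus and the hyperparameter values are placeholders chosen for illustration; a real run would use a much larger text collection and tuned settings.

```python
# Minimal sketch: training Word2Vec embeddings with gensim
# (assumes gensim >= 4.0 is installed; the corpus below is illustrative only).
from gensim.models import Word2Vec

# Tiny tokenized corpus; real training uses millions of sentences.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,  # dimensionality of the embedding vectors
    window=2,        # context window size
    min_count=1,     # keep every word in this toy corpus
    sg=1,            # 1 = skip-gram; 0 = CBOW
    epochs=100,
    seed=42,
    workers=1,       # single worker keeps results reproducible with a fixed seed
)

# Each word is now a 50-dimensional vector of real numbers.
print(model.wv["king"].shape)                 # (50,)
# Cosine similarity between learned vectors.
print(model.wv.similarity("king", "queen"))
```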

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 571-580 of 4,002 papers

Title | Status | Hype
From Hyperbolic Geometry Back to Word Embeddings | Code | 0
A Survey on Word Meta-Embedding Learning | No code | 0
Emotion-Aware Transformer Encoder for Empathetic Dialogue Generation | Code | 1
Is Neural Topic Modelling Better than Clustering? An Empirical Study on Clustering with Contextual Embeddings for Topics | Code | 1
Towards Arabic Sentence Simplification via Classification and Generative Approaches | Code | 0
Unsupervised Numerical Reasoning to Extract Phenotypes from Clinical Text by Leveraging External Knowledge | No code | 0
Multimodal Hate Speech Detection from Bengali Memes and Texts | Code | 0
BLCU-ICALL at SemEval-2022 Task 1: Cross-Attention Multitasking Framework for Definition Modeling | Code | 0
Word Embeddings Are Capable of Capturing Rhythmic Similarity of Words | Code | 0
Towards Better Chinese-centric Neural Machine Translation for Low-resource Languages | Code | 1
