
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
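Below is a minimal sketch of this idea in Python: a toy embedding table maps each word to a dense vector, and cosine similarity between vectors serves as a proxy for semantic similarity. The vocabulary and vector values are invented for illustration, not taken from any trained model.

```python
import numpy as np

# Toy embedding table: each word maps to a dense vector of real numbers.
# These values are made up for illustration only.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10]),
    "queen": np.array([0.78, 0.68, 0.12]),
    "apple": np.array([0.05, 0.10, 0.90]),
}

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine of the angle between two word vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high: related words
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low: unrelated words
```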

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
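As a concrete example, the sketch below trains Word2Vec on a tiny made-up corpus using the gensim library (this assumes gensim >= 4.0 is installed; the corpus, hyperparameters, and printed queries are illustrative, not a recipe from any of the papers listed here).

```python
from gensim.models import Word2Vec

# Tiny illustrative corpus: a list of tokenized sentences.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "local", "context"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "statistics"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,  # dimensionality of the learned vectors
    window=3,        # context window size
    min_count=1,     # keep every word in this tiny corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
    epochs=50,       # extra passes so the toy corpus learns anything at all
)

vector = model.wv["embeddings"]                     # learned 50-dim vector
print(model.wv.most_similar("embeddings", topn=3))  # nearest neighbors by cosine
```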

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 221–230 of 4,002 papers

Title | Status | Hype
--- | --- | ---
Sentiment Word Aware Multimodal Refinement for Multimodal Sentiment Analysis with ASR Errors | Code | 1
Shortformer: Better Language Modeling using Shorter Inputs | Code | 1
Classification Benchmarks for Under-resourced Bengali Language based on Multichannel Convolutional-LSTM Network | Code | 1
Combining Static Word Embeddings and Contextual Representations for Bilingual Lexicon Induction | Code | 1
Structured Pruning of Large Language Models | Code | 1
Supervised Learning of Universal Sentence Representations from Natural Language Inference Data | Code | 1
CODER: Knowledge infused cross-lingual medical term embedding for term normalization | Code | 1
"This is my unicorn, Fluffy": Personalizing frozen vision-language representations | Code | 1
Topic Modeling in Embedding Spaces | Code | 1
Gender Bias in Contextualized Word Embeddings | Code | 1

No leaderboard results yet.