Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
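As a minimal sketch of this mapping (the 3-dimensional vectors below are toy values, not taken from any trained model), each word is looked up in a table of real-valued vectors, and geometric measures such as cosine similarity can then compare meanings:

```python
import numpy as np

# Toy embedding table: each vocabulary word maps to a vector of real
# numbers. The values are illustrative only, not from a trained model.
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1]),
    "queen": np.array([0.7, 0.4, 0.1]),
    "apple": np.array([0.1, 0.2, 0.9]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two word vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should end up with more similar vectors.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```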

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
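For instance, a Word2Vec model can be trained with the gensim library; the following is a rough sketch, assuming gensim 4.x is installed and using a toy corpus in place of real training data:

```python
from gensim.models import Word2Vec

# Toy corpus: a real application would use a large tokenized corpus.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["an", "apple", "is", "a", "fruit"],
]

# Train a skip-gram Word2Vec model (sg=1); hyperparameters are illustrative.
model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the learned vectors
    window=2,         # context window size
    min_count=1,      # keep all words in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=100,
)

vec = model.wv["king"]                        # learned vector for "king"
print(model.wv.most_similar("king", topn=2))  # nearest neighbours by cosine
```

Word2Vec learns from local context windows, whereas GloVe is fit to global word co-occurrence statistics; both yield dense real-valued vectors of the kind described above.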

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3941–3950 of 4002 papers

Title | Status | Hype
An Embedded Diachronic Sense Change Model with a Case Study from Ancient Greek | Code | 0
Model Transfer for Tagging Low-resource Languages using a Bilingual Dictionary | Code | 0
BULNER: BUg Localization with word embeddings and NEtwork Regularization | Code | 0
Building Sequential Inference Models for End-to-End Response Selection | Code | 0
A Neighbourhood-Aware Differential Privacy Mechanism for Static Word Embeddings | Code | 0
SPINE: SParse Interpretable Neural Embeddings | Code | 0
Will LLMs Replace the Encoder-Only Models in Temporal Relation Classification? | Code | 0
MoNoise: Modeling Noise Using a Modular Normalization System | Code | 0
Monolingual and Parallel Corpora for Kangri Low Resource Language | Code | 0
Building End-To-End Dialogue Systems Using Generative Hierarchical Neural Network Models | Code | 0

No leaderboard results yet.