
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural approaches that train the embeddings on an NLP task such as language modeling or document classification.
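
As a concrete illustration, the sketch below trains skip-gram Word2Vec embeddings on a toy corpus with the gensim library (gensim 4.x API) and queries nearest neighbours. The corpus, hyperparameters (vector_size, window, min_count, epochs), and the probe words are illustrative assumptions, not taken from any paper listed on this page.

# Minimal Word2Vec sketch with gensim (assumes: pip install gensim).
# Corpus and hyperparameters are illustrative only.
from gensim.models import Word2Vec

corpus = [
    ["word", "embeddings", "map", "words", "to", "dense", "vectors"],
    ["similar", "words", "receive", "similar", "vectors"],
    ["word2vec", "learns", "vectors", "from", "context", "windows"],
]

# sg=1 selects the skip-gram objective; vector_size is the embedding dimension.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=50, seed=0)

vec = model.wv["vectors"]                       # a 50-dimensional real-valued vector
print(vec.shape)                                # (50,)
print(model.wv.most_similar("words", topn=3))   # nearest neighbours by cosine similarity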

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 3791–3800 of 4,002

Title | Status | Hype
Profiling of Intertextuality in Latin Literature Using Word Embeddings | Code | 0
Projective Methods for Mitigating Gender Bias in Pre-trained Language Models | Code | 0
From Small to Large Language Models: Revisiting the Federalist Papers | Code | 0
From Text to Lexicon: Bridging the Gap between Word Embeddings and Lexical Resources | Code | 0
From the Paft to the Fiiture: a Fully Automatic NMT and Word Embeddings Method for OCR Post-Correction | Code | 0
Characterizing Diseases from Unstructured Text: A Vocabulary Driven Word2vec Approach | Code | 0
ProMap: Effective Bilingual Lexicon Induction via Language Model Prompting | Code | 0
Marked Attribute Bias in Natural Language Inference | Code | 0
Massively Multilingual Word Embeddings | Code | 0
Frustratingly Easy Meta-Embedding -- Computing Meta-Embeddings by Averaging Source Word Embeddings | Code | 0
Page 380 of 401

No leaderboard results yet.