Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
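
As a concrete illustration, the sketch below trains a small skip-gram Word2Vec model with gensim (a commonly used implementation; the corpus and hyperparameters here are purely illustrative, and gensim >= 4.0 is assumed):

```python
# Minimal sketch: learning word embeddings with gensim's Word2Vec.
# The toy corpus and all hyperparameters are illustrative, not prescriptive.
from gensim.models import Word2Vec

# Each document is a pre-tokenized list of words.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["glove", "and", "word2vec", "learn", "embeddings", "from", "text"],
]

# sg=1 selects the skip-gram objective; vector_size is the embedding dimension.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

# Every vocabulary word is now mapped to a real-valued vector.
vector = model.wv["embeddings"]  # numpy array of shape (50,)
print(model.wv.most_similar("embeddings", topn=3))
```

Once trained, the vectors support similarity queries and can be fed into downstream models in place of one-hot word representations.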

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1111–1120 of 4002 papers

Title | Status | Hype
Evaluation Of Word Embeddings From Large-Scale French Web Content | Code | 0
Discourse Relation Embeddings: Representing the Relations between Discourse Segments in Social Media | Code | 0
Large-scale Taxonomy Induction Using Entity and Word Embeddings | — | 0
GraphTMT: Unsupervised Graph-based Topic Modeling from Video Transcripts | Code | 0
Impact of Gender Debiased Word Embeddings in Language Modeling | — | 0
Learning to Lemmatize in the Word Representation Space | — | 0
Cross-lingual hate speech detection based on multilingual domain-specific word embeddings | — | 0
Mitigating Political Bias in Language Models Through Reinforced Calibration | — | 0
A Bi-Encoder LSTM Model For Learning Unstructured Dialogs | Code | 0
A Short Survey of Pre-trained Language Models for Conversational AI - A New Age in NLP | — | 0

No leaderboard results yet.