Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include count-based methods such as GloVe and neural network approaches such as Word2Vec, which train on a proxy NLP task such as language modeling or document classification.
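As a concrete illustration, below is a minimal sketch of training Word2Vec embeddings and querying nearest neighbors by cosine similarity. It assumes the gensim library (>= 4.0) is installed; the three-sentence corpus is a made-up toy example, whereas real embeddings are trained on corpora of millions of tokens.

```python
# Minimal Word2Vec sketch, assuming gensim >= 4.0 (pip install gensim).
# The toy corpus below is illustrative only.
from gensim.models import Word2Vec

corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["word", "embeddings", "map", "words", "to", "vectors"],
]

# sg=1 selects the skip-gram objective; vector_size is the embedding dimension.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=100)

vec = model.wv["king"]      # a 50-dimensional real-valued vector
print(vec.shape)            # (50,)

# Nearest neighbors in the embedding space, ranked by cosine similarity.
print(model.wv.most_similar("king", topn=3))
```

The skip-gram objective (sg=1) learns embeddings by predicting context words from a center word; passing sg=0 instead selects the CBOW variant, which predicts the center word from its context.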

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3661–3670 of 4002 papers

Title | Status | Hype
Aligning Multilingual Word Embeddings for Cross-Modal Retrieval Task | Code | 0
A Neural Language Model for Dynamically Representing the Meanings of Unknown Words and Entities in a Discourse | Code | 0
The Dynamic Embedded Topic Model | Code | 0
Exploration of register-dependent lexical semantics using word embeddings | Code | 0
Political Stance in Danish | Code | 0
The Early Modern Dutch Mediascape. Detecting Media Mentions in Chronicles Using Word Embeddings and CRF | Code | 0
ViCE: Improving Dense Representation Learning by Superpixelization and Contrasting Cluster Assignment | Code | 0
A Simple and Effective Approach for Fine Tuning Pre-trained Word Embeddings for Improved Text Classification | Code | 0
ViCo: Word Embeddings from Visual Co-occurrences | Code | 0
Exploring Diachronic Lexical Semantics with JeSemE | Code | 0

No leaderboard results yet.