
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches trained on an NLP task such as language modeling or document classification.
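A core property of word embeddings is that semantically related words map to nearby vectors, usually measured by cosine similarity. The sketch below illustrates this with a hand-made toy embedding table; the vectors and words are illustrative assumptions, not output from a trained Word2Vec or GloVe model.

```python
import math

# Toy embedding table: each word maps to a small real-valued vector.
# These vectors are made up for illustration, not learned from data.
embeddings = {
    "king":  [0.80, 0.65, 0.10],
    "queen": [0.78, 0.70, 0.12],
    "apple": [0.10, 0.05, 0.90],
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; 1.0 means identical direction."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Related words get nearby vectors, so their cosine similarity is high;
# unrelated words point in different directions and score lower.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # near 1.0
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # much lower
```

In a real system the table would come from a trained model (for example, Gensim's Word2Vec implementation), but lookup and similarity computation work exactly this way.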

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 641-650 of 4,002 papers

Title (Status, Hype):

- Learning Dynamic Contextualised Word Embeddings via Template-based Temporal Adaptation (Code, Hype: 0)
- Dialogue Term Extraction using Transfer Learning and Topological Data Analysis (Hype: 0)
- Lost in Context? On the Sense-wise Variance of Contextualized Word Embeddings (Hype: 0)
- Word-Embeddings Distinguish Denominal and Root-Derived Verbs in Semitic (Hype: 0)
- Assessing the Unitary RNN as an End-to-End Compositional Model of Syntax (Hype: 0)
- Where's the Learning in Representation Learning for Compositional Semantics and the Case of Thematic Fit (Hype: 0)
- Large scale analysis of gender bias and sexism in song lyrics (Hype: 0)
- Benchmarking zero-shot and few-shot approaches for tokenization, tagging, and dependency parsing of Tagalog text (Hype: 0)
- Gender bias in (non)-contextual clinical word embeddings for stereotypical medical categories (Hype: 0)
- Massively Multilingual Lexical Specialization of Multilingual Transformers (Hype: 0)
Page 65 of 401

No leaderboard results yet.