
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
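As a concrete illustration of how words are mapped to vectors of real numbers, here is a minimal sketch that trains skip-gram Word2Vec embeddings with the gensim library; the toy corpus and hyperparameters (vector_size, window) are illustrative assumptions, not taken from any of the papers listed below.

```python
# Minimal Word2Vec sketch (gensim 4.x API); corpus and settings are toy values.
from gensim.models import Word2Vec

# Each training "sentence" is a pre-tokenized list of words.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["similar", "words", "end", "up", "with", "similar", "vectors"],
    ["embeddings", "are", "learned", "from", "word", "context"],
]

model = Word2Vec(
    sentences,
    vector_size=50,  # dimensionality of the embedding space
    window=2,        # context window around each target word
    min_count=1,     # keep every word; the toy corpus is tiny
    sg=1,            # 1 = skip-gram objective, 0 = CBOW
)

vec = model.wv["embeddings"]   # a 50-dimensional real-valued vector
print(vec.shape)               # (50,)
# Nearest neighbors by cosine similarity in the learned space:
print(model.wv.most_similar("embeddings", topn=3))
```

On a realistic corpus the nearest neighbors become semantically meaningful, which is what downstream tasks such as language modeling and document classification exploit.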

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2651–2660 of 4002 papers

| Title | Status | Hype |
| --- | --- | --- |
| Diachronic degradation of language models: Insights from social media |  | 0 |
| Word Error Rate Estimation for Speech Recognition: e-WER | Code | 1 |
| Fully Statistical Neural Belief Tracking | Code | 0 |
| Towards Optimal Transport with Global Invariances |  | 0 |
| Mapping Unparalleled Clinical Professional and Consumer Languages with Embedding Alignment |  | 0 |
| The Corpus Replication Task |  | 0 |
| Evaluation of sentence embeddings in downstream and linguistic probing tasks | Code | 0 |
| GLoMo: Unsupervisedly Learned Relational Graphs as Transferable Representations | Code | 0 |
| fMRI Semantic Category Decoding using Linguistic Encoding of Word Embeddings |  | 0 |
| Learning Acoustic Word Embeddings with Temporal Context for Query-by-Example Speech Search |  | 0 |
Page 266 of 401

No leaderboard results yet.