
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec and GloVe, as well as other neural network-based approaches that train on an NLP task such as language modeling or document classification.
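As a minimal sketch of the idea, the snippet below trains skip-gram Word2Vec embeddings with the gensim library; the toy corpus and hyperparameters are illustrative assumptions, not drawn from any paper listed on this page.

# Minimal sketch: learning word embeddings with skip-gram Word2Vec (gensim).
# The corpus and hyperparameter values are assumptions for illustration only.
from gensim.models import Word2Vec

# Toy corpus: each document is a list of tokens.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "and", "glove", "learn", "such", "vectors"],
    ["embeddings", "capture", "semantic", "similarity", "between", "words"],
]

# sg=1 selects the skip-gram objective; vector_size is the embedding dimension.
model = Word2Vec(sentences=corpus, vector_size=50, window=3,
                 min_count=1, sg=1, epochs=50)

# Each vocabulary word is now mapped to a dense real-valued vector.
vector = model.wv["embeddings"]                  # numpy array of shape (50,)
print(model.wv.most_similar("word", topn=3))     # nearest neighbors by cosine similarity

In practice the same API is run over a large corpus, and the resulting vectors are reused as features for downstream NLP tasks.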

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing papers 1781-1790 of 4002 (page 179 of 401)

Title | Status | Hype
Bootstrapping NLU Models with Multi-task Learning | - | 0
What do you mean, BERT? Assessing BERT as a Distributional Semantics Model | - | 0
Learning Relationships between Text, Audio, and Video via Deep Canonical Correlation for Multimodal Language Analysis | - | 0
word2ket: Space-efficient Word Embeddings inspired by Quantum Entanglement | Code | 0
How to Evaluate Word Representations of Informal Domain? | Code | 0
Learning Multi-Sense Word Distributions using Approximate Kullback-Leibler Divergence | - | 0
Contextualized End-to-End Neural Entity Linking | - | 0
Towards Understanding Gender Bias in Relation Extraction | Code | 0
Should All Cross-Lingual Embeddings Speak English? | Code | 0
Ruminating Word Representations with Random Noised Masker | - | 0

No leaderboard results yet.