Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches trained on an NLP task such as language modeling or document classification.

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)
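
To make the mapping concrete, here is a minimal sketch of training such embeddings with gensim's Word2Vec; the toy corpus and hyperparameters are illustrative assumptions (gensim 4.x API), not part of this page:

```python
# Minimal Word2Vec sketch (assumes gensim 4.x; toy corpus is illustrative only).
from gensim.models import Word2Vec

# Tiny tokenized corpus; real training uses millions of sentences.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["a", "man", "walks", "in", "the", "city"],
    ["a", "woman", "walks", "in", "the", "city"],
]

# sg=1 selects the skip-gram objective; vector_size is the embedding dimension.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

vec = model.wv["king"]                        # a 50-dimensional real-valued vector
print(vec.shape)                              # (50,)
print(model.wv.most_similar("king", topn=2))  # nearest neighbors by cosine similarity
```

With enough data, the learned vectors support similarity queries and the well-known analogy arithmetic, e.g. vector("king") - vector("man") + vector("woman") lands near vector("queen").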

Papers

Showing 2831–2840 of 4002 papers

Text2Node: a Cross-Domain System for Mapping Arbitrary Phrases to a Taxonomy
Self-supervised audio representation learning for mobile devices
Emotional Embeddings: Refining Word Embeddings to Capture Emotional Content of Words
Examining Structure of Word Embeddings with PCA
Regularization Advantages of Multilingual Neural Language Models for Low Resource Domains
On the Robustness of Unsupervised and Semi-supervised Cross-lingual Word Embedding Learning
Disentangling Latent Emotions of Word Embeddings on Complex Emotional Narratives
Parsimonious Morpheme Segmentation with an Application to Enriching Word Embeddings
Low-Rank Approximation of Matrices for PMI-based Word Embeddings
1Cademy at Semeval-2022 Task 1: Investigating the Effectiveness of Multilingual, Multitask, and Language-Agnostic Tricks for the Reverse Dictionary Task

Leaderboard

No leaderboard results yet.