
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
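
As a purely illustrative sketch of that idea, the snippet below builds a toy lookup table that maps each word in a tiny vocabulary to a real-valued vector and compares two words with cosine similarity; the vocabulary, dimensionality, and vector values are invented here, not taken from any listed paper.

```python
import numpy as np

# Toy vocabulary and an index for each word (illustrative only).
vocab = ["king", "queen", "apple"]
word_to_index = {word: i for i, word in enumerate(vocab)}

# In practice this matrix is learned from data; here it is just random.
rng = np.random.default_rng(0)
embedding_matrix = rng.normal(size=(len(vocab), 4))

def embed(word: str) -> np.ndarray:
    """Look up the real-valued vector assigned to a word."""
    return embedding_matrix[word_to_index[word]]

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Standard measure of closeness between two embedding vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine_similarity(embed("king"), embed("queen")))
```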

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
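
A minimal sketch of learning such embeddings with Word2Vec is shown below. It assumes the gensim library (4.x API) is installed; gensim, the toy corpus, and all parameter values are assumptions made for illustration and are not part of the listing on this page.

```python
from gensim.models import Word2Vec

# Toy corpus: each document is a pre-tokenised list of words (illustrative only).
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "statistics"],
]

# sg=1 selects the skip-gram objective; vector_size is the embedding dimension.
model = Word2Vec(
    sentences=corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=50
)

vector = model.wv["embeddings"]                       # 50-dimensional vector
neighbours = model.wv.most_similar("embeddings", topn=3)
print(vector.shape, neighbours)
```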

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 871–880 of 4,002 papers

Crossword: Estimating Unknown Embeddings using Cross Attention and Alignment Strategies
First Bilingual Word Embeddings for te reo Māori and English: Towards Code-switching Detection in a Low-resourced setting
On the interpretability and significance of bias metrics in texts: a PMI-based approach
Discrete Wavelet Transform for Efficient Word Embeddings and Sentence Encoding
Isomorphic Cross-lingual Embeddings for Low-Resource Languages
Cross-lingual Word Embeddings in Hyperbolic Space
Multi-Stage Framework with Refinement based Point Set Registration for Unsupervised Bi-Lingual Word Alignment
Softmax Bottleneck Makes Language Models Unable to Represent Multi-mode Word Distributions
Vec2Node: Self-training with Tensor Augmentation for Text Classification with Few Labels
FeelsGoodMan: Inferring Semantics of Twitch Neologisms

No leaderboard results yet.