
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) in which words or phrases from the vocabulary are mapped to vectors of real numbers.
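As a toy illustration of that mapping, the sketch below assigns hypothetical 4-dimensional vectors to a few words and compares them with cosine similarity; real embeddings are learned from data and typically have hundreds of dimensions.

```python
import numpy as np

# Toy illustration (hypothetical, hand-picked 4-dimensional vectors,
# not learned ones): each vocabulary item maps to a dense real vector.
embeddings = {
    "king":  np.array([0.80, 0.65, 0.10, 0.05]),
    "queen": np.array([0.75, 0.70, 0.12, 0.04]),
    "apple": np.array([0.05, 0.10, 0.90, 0.70]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two embedding vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should sit closer together in the space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```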

Techniques for learning word embeddings include predictive neural models such as Word2Vec, count-based methods such as GloVe, and other approaches trained on an NLP task such as language modeling or document classification.
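As a minimal sketch of how such embeddings can be trained in practice, the snippet below uses gensim's Word2Vec (assuming gensim >= 4.0); the tiny corpus and hyperparameters are illustrative only, not drawn from the papers listed here.

```python
from gensim.models import Word2Vec

# Illustrative toy corpus: a list of tokenized sentences.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
    ["glove", "builds", "embeddings", "from", "cooccurrence", "statistics"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the learned vectors
    window=2,         # context window size
    min_count=1,      # keep every token in this toy corpus
    sg=1,             # 1 = skip-gram; 0 = CBOW
    epochs=50,
)

vec = model.wv["embeddings"]   # learned vector for a word
print(vec.shape)               # (50,)
print(model.wv.most_similar("embeddings", topn=3))
```

Skip-gram (sg=1) tends to work better for small corpora and rare words, while CBOW is faster on large corpora; on a corpus this small the output is noisy and serves only to show the API shape.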

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1231–1240 of 4002 papers

Title | Status | Hype
DeepHateExplainer: Explainable Hate Speech Detection in Under-resourced Bengali Language | Code | 0
WEmbSim: A Simple yet Effective Metric for Image Captioning | - | 0
Improved Biomedical Word Embeddings in the Transformer Era | Code | 0
Model Choices Influence Attributive Word Associations: A Semi-supervised Analysis of Static Word Embeddings | - | 0
Intrinsic Image Captioning Evaluation | - | 0
A comparison of self-supervised speech representations as input features for unsupervised acoustic word embeddings | - | 0
Discriminative Pre-training for Low Resource Title Compression in Conversational Grocery | - | 0
TF-CR: Weighting Embeddings for Text Classification | Code | 0
Improving Zero Shot Learning Baselines with Commonsense Knowledge | - | 0
Cross-lingual Word Sense Disambiguation using mBERT Embeddings with Syntactic Dependencies | - | 0
