
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
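As a minimal illustration of this mapping (a toy sketch only: the vocabulary, dimensionality, and random vectors below are invented for the example, not taken from any particular method), an embedding can be stored as a matrix with one real-valued row per vocabulary word:

```python
import numpy as np

# Toy vocabulary and embedding dimensionality (both invented for illustration).
vocab = {"king": 0, "queen": 1, "man": 2, "woman": 3}
embedding_dim = 4

# Embedding matrix: one real-valued vector (row) per word in the vocabulary.
# In practice these values are learned; here they are random placeholders.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), embedding_dim))

def embed(word: str) -> np.ndarray:
    """Map a word to its vector of real numbers via a table lookup."""
    return embeddings[vocab[word]]

print(embed("queen"))  # a 4-dimensional real-valued vector
```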

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
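As one concrete route, here is a sketch of training Word2Vec embeddings with the gensim library; the toy corpus and the parameter values are arbitrary choices for illustration, not settings recommended by any of the papers below:

```python
from gensim.models import Word2Vec

# Tiny toy corpus (invented for illustration); real training uses large text collections.
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "raw", "text"],
    ["similar", "words", "get", "similar", "vectors"],
]

# Train a skip-gram Word2Vec model: each word is mapped to a 50-dimensional vector.
model = Word2Vec(sentences, vector_size=50, window=3, min_count=1, sg=1, epochs=50)

vec = model.wv["embeddings"]                      # the learned vector for one word
similar = model.wv.most_similar("word", topn=3)   # nearest neighbours in vector space
print(vec.shape, similar)
```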

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 241–250 of 4,002 papers

Title | Status | Hype
Optimal Hyperparameters for Deep LSTM-Networks for Sequence Labeling Tasks | Code | 1
Supervised Learning of Universal Sentence Representations from Natural Language Inference Data | Code | 1
Multimodal Word Distributions | Code | 1
FastText.zip: Compressing text classification models | Code | 1
Learning principled bilingual mappings of word embeddings while preserving monolingual invariance | Code | 1
Adversarial Training Methods for Semi-Supervised Text Classification | Code | 1
From Word Embeddings to Item Recommendation | Code | 1
Short Text Clustering via Convolutional Neural Networks | Code | 1
Speak2Sign3D: A Multi-modal Pipeline for English Speech to American Sign Language Animation | | 0
Computational Detection of Intertextual Parallels in Biblical Hebrew: A Benchmark Study Using Transformer-Based Language Models | | 0

No leaderboard results yet.