
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) in which words or phrases from the vocabulary are mapped to vectors of real numbers.
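
As a minimal sketch of the mapping itself, consider an embedding matrix whose rows are word vectors; the vocabulary, dimensionality, and values below are hypothetical:

```python
import numpy as np

# Hypothetical three-word vocabulary mapped to row indices.
vocab = {"king": 0, "queen": 1, "apple": 2}

# |V| x d embedding matrix of real numbers (d = 4 here); in practice
# these values are learned during training, not drawn at random.
rng = np.random.default_rng(0)
embedding = rng.normal(size=(len(vocab), 4))

def embed(word: str) -> np.ndarray:
    """Look up the real-valued vector assigned to a word."""
    return embedding[vocab[word]]

print(embed("king"))  # a 4-dimensional vector of real numbers
```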

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that are trained on an NLP task such as language modeling or document classification.
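
To make the Word2Vec case concrete, here is a hedged sketch using the gensim library (assuming gensim 4.x, where the constructor takes vector_size; the three-sentence corpus is invented for illustration and far too small to yield meaningful vectors):

```python
from gensim.models import Word2Vec

# Toy corpus: each sentence is a list of tokens (hypothetical data).
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["glove", "and", "word2vec", "learn", "word", "embeddings"],
    ["language", "modeling", "trains", "word", "vectors"],
]

# sg=1 selects the skip-gram objective; vector_size is the embedding
# dimension and window the context size.
model = Word2Vec(sentences, vector_size=50, window=2,
                 min_count=1, sg=1, epochs=50)

vec = model.wv["word"]                        # 50-dimensional vector
print(vec.shape)                              # (50,)
print(model.wv.most_similar("word", topn=3))  # nearest neighbours by cosine
```

GloVe differs in that it factorizes a global co-occurrence matrix rather than predicting context words, but the output is the same kind of object: one real-valued vector per vocabulary word.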

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 711–720 of 4002 papers

Title | Status | Hype
Deep Image-to-Recipe Translation | Code | 0
Deep learning for language understanding of mental health concepts derived from Cognitive Behavioural Therapy | Code | 0
Bigrams and BiLSTMs: Two Neural Networks for Sequential Metaphor Detection | Code | 0
GLoMo: Unsupervisedly Learned Relational Graphs as Transferable Representations | Code | 0
Grammatical gender associations outweigh topical gender bias in crosslinguistic word embeddings | Code | 0
Classification and Clustering of Arguments with Contextualized Word Embeddings | Code | 0
Abolitionist Networks: Modeling Language Change in Nineteenth-Century Activist Newspapers | Code | 0
DeepHateExplainer: Explainable Hate Speech Detection in Under-resourced Bengali Language | Code | 0
Deep Pivot-Based Modeling for Cross-language Cross-domain Transfer with Minimal Guidance | Code | 0
Design and Implementation of a Quantum Kernel for Natural Language Processing | Code | 0

No leaderboard results yet.