
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
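As a minimal sketch of the idea, the snippet below represents an embedding as a lookup table mapping each vocabulary entry to a dense real-valued vector. The toy vocabulary, dimensionality, and random initialization are illustrative assumptions, not taken from any particular model.

```python
import numpy as np

# Hypothetical toy vocabulary; indices are assumed for illustration only.
vocab = {"king": 0, "queen": 1, "man": 2, "woman": 3}
dim = 8  # embedding dimensionality (kept small for illustration)

# An embedding table is just a |V| x d matrix of real numbers;
# here it is randomly initialized, as it would be before training.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), dim))

def embed(word: str) -> np.ndarray:
    """Map a word to its real-valued vector via table lookup."""
    return embeddings[vocab[word]]

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity, the usual measure of closeness between embeddings."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(embed("king").shape)                     # (8,)
print(cosine(embed("king"), embed("queen")))   # meaningless before training
```

Training replaces the random matrix with vectors in which geometric closeness tracks semantic or syntactic similarity.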

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
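A hedged sketch of learning such embeddings with gensim's Word2Vec is shown below (assuming gensim >= 4.0 is installed; the corpus and hyperparameters are toy stand-ins, not a recommended configuration).

```python
from gensim.models import Word2Vec

# Toy stand-in corpus: a list of tokenized sentences (assumed, for illustration).
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["a", "man", "walks"],
    ["a", "woman", "walks"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the learned vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

vec = model.wv["king"]                 # learned vector for "king"
print(model.wv.most_similar("king"))   # nearest neighbours in embedding space
```

GloVe differs in that it factorizes a global word co-occurrence matrix rather than training on local context windows, but the end product is the same kind of word-to-vector mapping.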

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3591–3600 of 4002 papers

Title | Status | Hype
Dependency Based Embeddings for Sentence Classification Tasks | | 0
Dependency-Based Semantic Role Labeling using Convolutional Neural Networks | | 0
Dependency-Based Word Embeddings | | 0
Dependency Link Embeddings: Continuous Representations of Syntactic Substructures | | 0
Dependency Parsing for Urdu: Resources, Conversions and Learning | | 0
Explainable Depression Detection with Multi-Modalities Using a Hybrid Deep Learning Model on Social Media | | 0
Derivational Morphological Relations in Word Embeddings | | 0
Deriving Contextualised Semantic Features from BERT (and Other Transformer Model) Embeddings | | 0
Deriving continous grounded meaning representations from referentially structured multimodal contexts | | 0
Page 360 of 401

No leaderboard results yet.