
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) in which words or phrases from the vocabulary are mapped to vectors of real numbers.
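For intuition, here is a minimal sketch of what such a mapping looks like. The vectors below are invented toy values rather than learned embeddings, and the cosine-similarity helper is a standard illustration, not something prescribed by this page.

```python
import numpy as np

# Toy 4-dimensional embeddings (values invented for illustration;
# real learned embeddings typically have 50-300 dimensions).
embeddings = {
    "king":  np.array([0.8, 0.1, 0.7, 0.3]),
    "queen": np.array([0.7, 0.2, 0.8, 0.3]),
    "apple": np.array([0.1, 0.9, 0.2, 0.6]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; 1.0 means identical direction."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should end up with more similar vectors.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```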

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
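As a concrete example, the sketch below trains a small Word2Vec model with the gensim library (an assumption; this page does not prescribe any particular implementation). The toy corpus and hyperparameters are invented for illustration only.

```python
from gensim.models import Word2Vec

# Tiny toy corpus (invented for demonstration); real training
# uses millions of tokenized sentences.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

# Train a skip-gram Word2Vec model (sg=1);
# vector_size is the dimensionality of the learned embeddings.
model = Word2Vec(corpus, vector_size=50, window=3, min_count=1, sg=1, epochs=50)

vector = model.wv["embeddings"]                           # learned 50-dim vector
neighbors = model.wv.most_similar("embeddings", topn=3)   # nearest words by cosine
print(vector.shape, neighbors)
```

On a realistic corpus, words that occur in similar contexts (e.g., related technical terms) are pulled toward nearby points in the vector space, which is what makes the nearest-neighbor query meaningful.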

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1141–1150 of 4002 papers

| Title | Status | Hype |
| --- | --- | --- |
| Deep Image-to-Recipe Translation | Code | 0 |
| Incorporating Syntactic and Semantic Information in Word Embeddings using Graph Convolutional Networks | Code | 0 |
| Discourse Relation Embeddings: Representing the Relations between Discourse Segments in Social Media | Code | 0 |
| Global Textual Relation Embedding for Relational Understanding | Code | 0 |
| DeepHateExplainer: Explainable Hate Speech Detection in Under-resourced Bengali Language | Code | 0 |
| Harnessing Cross-lingual Features to Improve Cognate Detection for Low-resource Languages | Code | 0 |
| DisCoDisCo at the DISRPT2021 Shared Task: A System for Discourse Segmentation, Classification, and Connective Detection | Code | 0 |
| Disentangling dialects: a neural approach to Indo-Aryan historical phonology and subgrouping | Code | 0 |
| Hindi/Bengali Sentiment Analysis Using Transfer Learning and Joint Dual Input Learning with Self Attention | Code | 0 |
| Eliciting Explicit Knowledge From Domain Experts in Direct Intrinsic Evaluation of Word Embeddings for Specialized Domains | Code | 0 |
