
Word Embeddings

Word embedding is the collective name for a set of language-modeling and feature-learning techniques in natural language processing (NLP) in which words or phrases from a vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that are trained on an NLP task such as language modeling or document classification.

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)
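The core idea above can be illustrated with a minimal, self-contained sketch: represent each word by its co-occurrence counts with every other vocabulary word, then compare words with cosine similarity. This is a deliberately simplified stand-in for Word2Vec or GloVe (which learn dense vectors rather than raw counts); the toy corpus, window size, and all names here are illustrative assumptions, not part of any library API.

```python
# Toy distributional word vectors: each word is represented by its
# co-occurrence counts with every vocabulary word. Word2Vec and GloVe
# learn dense vectors instead, but the underlying principle is the same:
# words appearing in similar contexts end up with similar vectors.
from math import sqrt

corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the rug".split(),
    "the cat chased the dog".split(),
]

vocab = sorted({w for sent in corpus for w in sent})
index = {w: i for i, w in enumerate(vocab)}
window = 2  # context words within +/-2 positions count as co-occurrences

# vectors[w][i] = how often vocab[i] appears within `window` of w
vectors = {w: [0.0] * len(vocab) for w in vocab}
for sent in corpus:
    for i, w in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if i != j:
                vectors[w][index[sent[j]]] += 1.0

def cosine(u, v):
    # Cosine similarity: dot product normalized by vector lengths.
    dot = sum(a * b for a, b in zip(u, v))
    norm = sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v))
    return dot / norm if norm else 0.0

# "cat" and "dog" occur in near-identical contexts in this corpus,
# so their vectors are closer than, say, "cat" and "mat".
print(cosine(vectors["cat"], vectors["dog"]))
print(cosine(vectors["cat"], vectors["mat"]))
```

In practice one would train dense embeddings on a large corpus (e.g. with a library such as gensim) rather than using raw counts, but the similarity computation on the resulting vectors looks the same.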

Papers

Showing papers 3201–3225 of 4002

All papers on this page currently have a hype score of 0 and no status.

- Automatic Detection of Incoherent Speech for Diagnosing Schizophrenia
- Automatic Generation of Multiple-Choice Questions
- Automatic Labeling of Problem-Solving Dialogues for Computational Microgenetic Learning Analytics
- Automatic Learning of Modality Exclusivity Norms with Crosslingual Word Embeddings
- Metaphor Interpretation Using Word Embeddings
- Automatic Noun Compound Interpretation using Deep Neural Networks and Word Embeddings
- Automatic Term Extraction from Newspaper Corpora: Making the Most of Specificity and Common Features
- Automatic Transformation of Clinical Narratives into Structured Format
- Automatic Triage of Mental Health Forum Posts
- Automatic Word Association Norms (AWAN)
- Automating Idea Unit Segmentation and Alignment for Assessing Reading Comprehension via Summary Protocol Analysis
- AWE: Asymmetric Word Embedding for Textual Entailment
- A Word Embedding Approach to Identifying Verb-Noun Idiomatic Combinations
- A Word Embedding Approach to Predicting the Compositionality of Multiword Expressions
- A Word-Embedding-based Sense Index for Regular Polysemy Representation
- AZMAT: Sentence Similarity Using Associative Matrices
- Massively Multilingual Lexical Specialization of Multilingual Transformers
- Backdoor Attacks in Federated Learning by Rare Embeddings and Gradient Ensembling
- Bad Form: Comparing Context-Based and Form-Based Few-Shot Learning in Distributional Semantic Models
- Bag-of-Vector Embeddings of Dependency Graphs for Semantic Induction
- Bag-of-Words Transfer: Non-Contextual Techniques for Multi-Task Learning
- BAHP: Benchmark of Assessing Word Embeddings in Historical Portuguese
- Balancing the composition of word embeddings across heterogenous data sets
- Batch IS NOT Heavy: Learning Word Representations From All Samples
- Bayesian Paragraph Vectors
Page 129 of 161

No leaderboard results yet.