
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
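
In other words, each vocabulary item becomes a point in a low-dimensional real-valued space, and geometric closeness stands in for semantic relatedness. A minimal Python sketch of the idea (the four-dimensional vectors below are hypothetical toy values, not trained embeddings):

    import numpy as np

    # Each word maps to a dense vector of real numbers. These values are
    # made up for illustration; real embeddings typically have 50-300
    # dimensions and are learned from large corpora.
    embeddings = {
        "king":  np.array([0.50, 0.68, -0.59, 0.10]),
        "queen": np.array([0.54, 0.71, -0.55, 0.60]),
        "apple": np.array([-0.80, 0.10, 0.72, -0.05]),
    }

    def cosine(u, v):
        # Cosine similarity: semantically related words should score higher.
        return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

    print(cosine(embeddings["king"], embeddings["queen"]))  # ~0.91, similar
    print(cosine(embeddings["king"], embeddings["apple"]))  # negative, dissimilar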

Techniques for learning word embeddings include predictive neural models such as Word2Vec, count-based methods such as GloVe (which fits vectors to global word co-occurrence statistics), and neural networks trained on a downstream NLP task such as language modeling or document classification.
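
As a concrete illustration, a skip-gram Word2Vec model can be trained in a few lines with the gensim library (a sketch assuming the gensim 4.x API; the three-sentence corpus is a hypothetical toy example, and real training uses far larger corpora):

    from gensim.models import Word2Vec

    # Hypothetical toy corpus: a list of tokenized sentences.
    sentences = [
        ["the", "cat", "sat", "on", "the", "mat"],
        ["the", "dog", "sat", "on", "the", "rug"],
        ["dogs", "and", "cats", "are", "pets"],
    ]

    # sg=1 selects the skip-gram objective; vectors are 50-dimensional,
    # trained to predict words within a context window of 2.
    model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

    vec = model.wv["cat"]                        # a 50-dimensional numpy array
    print(model.wv.most_similar("cat", topn=3))  # nearest neighbours in the space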

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 3211-3220 of 4002 papers

Title | Hype
Automating Idea Unit Segmentation and Alignment for Assessing Reading Comprehension via Summary Protocol Analysis | 0
AWE: Asymmetric Word Embedding for Textual Entailment | 0
A Word Embedding Approach to Identifying Verb-Noun Idiomatic Combinations | 0
A Word Embedding Approach to Predicting the Compositionality of Multiword Expressions | 0
A Word-Embedding-based Sense Index for Regular Polysemy Representation | 0
AZMAT: Sentence Similarity Using Associative Matrices | 0
Massively Multilingual Lexical Specialization of Multilingual Transformers | 0
Backdoor Attacks in Federated Learning by Rare Embeddings and Gradient Ensembling | 0
Bad Form: Comparing Context-Based and Form-Based Few-Shot Learning in Distributional Semantic Models | 0
Bag-of-Vector Embeddings of Dependency Graphs for Semantic Induction | 0
Page 322 of 401

Leaderboards

No leaderboard results yet.