
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches trained on an NLP task such as language modeling or document classification; a minimal training sketch follows below.
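As an illustrative sketch (not taken from this page), the Python snippet below trains skip-gram Word2Vec embeddings with the gensim library on a toy corpus; the corpus and parameter values are assumptions chosen for demonstration, not recommended settings.

```python
# Minimal sketch: learning word embeddings with gensim's Word2Vec (toy data).
from gensim.models import Word2Vec

# Hypothetical corpus: a list of tokenized sentences.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["similar", "words", "get", "similar", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
]

# sg=1 selects the skip-gram objective; vector_size is the embedding dimension.
model = Word2Vec(corpus, vector_size=50, window=2, min_count=1, sg=1, epochs=50)

# Each vocabulary word is now mapped to a real-valued vector.
vec = model.wv["words"]  # numpy array of shape (50,)
print(vec.shape)

# Cosine similarity in the embedding space surfaces related words.
print(model.wv.most_similar("words", topn=3))
```

On a realistic corpus the same call pattern applies; only the tokenized sentences and the hyperparameters (dimension, window, epochs) would change.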

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2621–2630 of 4002 papers

Using Large Pre-Trained Language Model to Assist FDA in Premarket Medical Device
Using Linked Disambiguated Distributional Networks for Word Sense Disambiguation
Using meaning instead of words to track topics
Using Mined Coreference Chains as a Resource for a Semantic Task
Using Neural Word Embeddings in the Analysis of the Clinical Semantic Verbal Fluency Task
Using pseudo-senses for improving the extraction of synonyms from word embeddings
Using reading behavior to predict grammatical functions
Using Sentences as Semantic Representations in Large Scale Zero-Shot Learning
Using time series and natural language processing to identify viral moments in the 2016 U.S. Presidential Debate
Using virtual edges to extract keywords from texts modeled as complex networks
