
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
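Concretely, an embedding layer is just a lookup table: each vocabulary word indexes a row of a real-valued matrix, and similarity between words can be measured between the corresponding vectors. Below is a minimal sketch of this idea; the toy vocabulary, dimension, and random initialization are illustrative assumptions, not taken from any particular paper (in practice the matrix entries are learned, not random).

```python
import numpy as np

# Hypothetical toy vocabulary; real systems use tens of thousands of words.
vocab = {"king": 0, "queen": 1, "man": 2, "woman": 3}
embedding_dim = 4  # real models typically use 50-300 dimensions

rng = np.random.default_rng(seed=0)
# The embedding matrix: one real-valued row vector per vocabulary word.
# Randomly initialized here for illustration; training would adjust it.
embeddings = rng.normal(size=(len(vocab), embedding_dim))

def embed(word: str) -> np.ndarray:
    """Map a word to its vector of real numbers via table lookup."""
    return embeddings[vocab[word]]

def cosine_similarity(u: np.ndarray, v: np.ndarray) -> float:
    """Standard similarity measure between two embedding vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(embed("king"))                                   # a 4-dimensional vector
print(cosine_similarity(embed("king"), embed("queen")))
```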

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural approaches that train on an NLP task such as language modeling or document classification.
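As an example of such a technique, a Word2Vec skip-gram model can be trained on a tokenized corpus in a few lines with the gensim library. This is a sketch assuming gensim 4.x; the tiny corpus and hyperparameters are illustrative only.

```python
from gensim.models import Word2Vec

# Hypothetical toy corpus: a list of tokenized sentences.
# Real training uses millions of sentences.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context"],
    ["similar", "words", "get", "similar", "vectors"],
]

# sg=1 selects the skip-gram objective (predict context words from a word);
# sg=0 would select CBOW (predict a word from its context).
model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the embedding vectors
    window=2,         # context window size on each side of the target word
    min_count=1,      # keep every word in this tiny corpus
    sg=1,
    epochs=50,
    seed=42,
)

vec = model.wv["embeddings"]                 # the learned 50-dim vector
print(model.wv.most_similar("words", topn=3))
```

After training, `model.wv` holds the learned vectors, and nearest-neighbor queries such as `most_similar` surface words that occur in similar contexts.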

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2331-2340 of 4002 papers

Title | Status | Hype
Combining neural and knowledge-based approaches to Named Entity Recognition in Polish | - | 0
Application of Clinical Concept Embeddings for Heart Failure Prediction in UK EHR data | - | 0
Inline Detection of Domain Generation Algorithms with Context-Sensitive Word Embeddings | - | 0
Learning Relation Representations from Word Representations | - | 0
NormCo: Deep Disease Normalization for Biomedical Knowledge Base Construction | - | 0
Joint Learning of Hierarchical Word Embeddings from a Corpus and a Taxonomy | - | 0
Correcting the Common Discourse Bias in Linear Representation of Sentences using Conceptors | - | 0
HCU400: An Annotated Dataset for Exploring Aural Phenomenology Through Causal Uncertainty | Code | 0
A Deterministic Algorithm for Bridging Anaphora Resolution | - | 0
Few-shot Learning for Named Entity Recognition in Medical Text | Code | 0
