
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
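To make the idea concrete, here is a minimal sketch in plain Python with NumPy of an embedding as a lookup table from words to real-valued vectors. The vector values are made up for illustration; in trained embeddings they are learned from data, so semantically related words end up close together in the vector space:

```python
import numpy as np

# A word embedding maps each vocabulary word to a dense vector of
# real numbers. These values are invented for illustration only.
embeddings = {
    "king":  np.array([0.50, 0.68, -0.59]),
    "queen": np.array([0.54, 0.87, -0.47]),
    "apple": np.array([-0.71, 0.13, 0.64]),
}

def cosine(u, v):
    """Cosine similarity: higher means the words are more similar."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(embeddings["king"], embeddings["queen"]))  # ~0.98, similar
print(cosine(embeddings["king"], embeddings["apple"]))  # ~-0.65, dissimilar
```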

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
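As an illustration, here is a minimal Word2Vec training sketch using the gensim library (assuming gensim 4.x; the toy corpus and hyperparameter values below are made up for demonstration, not tuned settings):

```python
from gensim.models import Word2Vec

# Toy corpus of tokenized sentences; real training uses far more text.
sentences = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["i", "ate", "an", "apple", "and", "a", "banana"],
]

model = Word2Vec(
    sentences,
    vector_size=100,  # dimensionality of the learned embeddings
    window=5,         # context window size around each target word
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # skip-gram objective (0 would select CBOW)
)

vector = model.wv["king"]                # the learned embedding vector
similar = model.wv.most_similar("king")  # nearest neighbours by cosine
```

The skip-gram objective trains the network to predict context words from a target word; the hidden-layer weights it learns along the way are the embeddings themselves.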

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1851–1860 of 4002 papers

| Title | Status | Hype |
| --- | --- | --- |
| Deep Text Mining of Instagram Data Without Strong Supervision | Code | 0 |
| GNTeam at 2018 n2c2: Feature-augmented BiLSTM-CRF for drug-related entity recognition in hospital discharge summaries | Code | 0 |
| Generating Timelines by Modeling Semantic Change | Code | 0 |
| Low-Rank Approximation of Matrices for PMI-based Word Embeddings | | 0 |
| Cross-lingual Dependency Parsing with Unlabeled Auxiliary Languages | Code | 0 |
| Triplet-Aware Scene Graph Embeddings | | 0 |
| CogniVal: A Framework for Cognitive Word Embedding Evaluation | Code | 0 |
| Multi-sense Definition Modeling using Word Sense Decompositions | | 0 |
| Cross-Lingual Contextual Word Embeddings Mapping With Multi-Sense Words In Mind | | 0 |
| Corporate IT-support Help-Desk Process Hybrid-Automation Solution with Machine Learning Approach | | 0 |
Page 186 of 401

No leaderboard results yet.