
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
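As a rough illustration of the idea (a minimal sketch, not the Word2Vec implementation), the following trains skip-gram-style embeddings with a full softmax over a toy vocabulary: each word is mapped to a dense real-valued vector, updated so that it predicts the words appearing in its context window. All function and parameter names here are illustrative assumptions.

```python
# Minimal skip-gram sketch: words -> dense real-valued vectors, trained
# so that a word's vector predicts its neighbours in a context window.
# Illustrative only; real systems use negative sampling or hierarchical
# softmax instead of a full softmax, and far larger corpora.
import numpy as np

def train_skipgram(corpus, dim=16, window=2, epochs=50, lr=0.05, seed=0):
    """Return a dict mapping each word to a `dim`-dimensional vector."""
    rng = np.random.default_rng(seed)
    vocab = sorted({w for sent in corpus for w in sent})
    idx = {w: i for i, w in enumerate(vocab)}
    V = len(vocab)
    W_in = rng.normal(scale=0.1, size=(V, dim))   # target-word vectors
    W_out = rng.normal(scale=0.1, size=(V, dim))  # context-word vectors
    for _ in range(epochs):
        for sent in corpus:
            for pos, word in enumerate(sent):
                t = idx[word]
                lo, hi = max(0, pos - window), min(len(sent), pos + window + 1)
                for cpos in range(lo, hi):
                    if cpos == pos:
                        continue
                    c = idx[sent[cpos]]
                    # softmax over the (tiny) vocabulary
                    scores = W_out @ W_in[t]
                    probs = np.exp(scores - scores.max())
                    probs /= probs.sum()
                    grad = probs.copy()
                    grad[c] -= 1.0                  # d(loss)/d(scores)
                    in_grad = W_out.T @ grad        # compute before update
                    W_out -= lr * np.outer(grad, W_in[t])
                    W_in[t] -= lr * in_grad
    return {w: W_in[idx[w]] for w in vocab}
```

Because "king" and "queen" below share the same contexts, their learned vectors end up closer to each other (by cosine similarity) than either is to "dog" — the distributional property that makes embeddings useful:

```python
corpus = [["king", "rules", "kingdom"],
          ["queen", "rules", "kingdom"],
          ["dog", "chases", "cat"]]
emb = train_skipgram(corpus)
```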

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2421–2430 of 4002 papers

Hybrid Code Networks using a convolutional neural network as an input layer achieves higher turn accuracy
Hybrid Improved Document-level Embedding (HIDE)
Hybrid Text Feature Modeling for Disease Group Prediction using Unstructured Physician Notes
Hyperbolic Centroid Calculations for Text Classification
Hyperspherical Query Likelihood Models with Word Embeddings
Hypothesis Testing based Intrinsic Evaluation of Word Embeddings
I2DFormer: Learning Image to Document Attention for Zero-Shot Image Classification
ICL-HD at SemEval-2016 Task 10: Improving the Detection of Minimal Semantic Units and their Meanings with an Ontology and Word Embeddings
Identification of Biased Terms in News Articles by Comparison of Outlet-specific Word Embeddings
Identification of Indigenous Knowledge Concepts through Semantic Networks, Spelling Tools and Word Embeddings
Page 243 of 401

No leaderboard results yet.