
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
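
As a minimal illustration of this idea (not a real model), the sketch below maps a tiny vocabulary to made-up 3-dimensional vectors and compares them with cosine similarity; real embeddings are learned from corpora and typically have hundreds of dimensions:

```python
# Minimal illustration: a tiny vocabulary mapped to made-up 3-dimensional
# vectors. Real embeddings are learned from corpora, not hand-written.
import numpy as np

embeddings = {
    "king":  np.array([0.8, 0.3, 0.1]),
    "queen": np.array([0.7, 0.4, 0.2]),
    "apple": np.array([0.1, 0.9, 0.8]),
}

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    # Cosine similarity, the usual way to compare embedding vectors.
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(embeddings["king"], embeddings["queen"]))  # high: related words
print(cosine(embeddings["king"], embeddings["apple"]))  # lower: unrelated words
```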

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural approaches that train on an NLP task such as language modeling or document classification.

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)
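
Concretely, a word2vec model can be trained in a few lines with the gensim library. The sketch below assumes the gensim 4.x API; the corpus and hyperparameters are toy values chosen for illustration, not recommended settings:

```python
# Sketch: training word2vec embeddings with gensim (4.x API assumed).
from gensim.models import Word2Vec

corpus = [  # each document is a list of tokens
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "local", "context"],
    ["glove", "fits", "embeddings", "to", "global", "cooccurrence", "counts"],
]

model = Word2Vec(
    corpus,
    vector_size=50,  # embedding dimension
    window=2,        # context window size
    min_count=1,     # keep every token in this tiny corpus
    epochs=100,
)

vec = model.wv["embeddings"]  # a 50-dimensional real-valued vector
print(vec.shape)              # (50,)
print(model.wv.most_similar("embeddings", topn=3))
```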

Papers

Showing 1211–1220 of 4002 papers

Title | Status | Hype
Misspelling Correction with Pre-trained Contextual Language Model | - | 0
Graph-of-Tweets: A Graph Merging Approach to Sub-event Identification | Code | 0
Political Depolarization of News Articles Using Attribute-aware Word Embeddings | - | 0
Integration of Domain Knowledge using Medical Knowledge Graph Deep Learning for Cancer Phenotyping | - | 0
Lex-BERT: Enhancing BERT based NER with lexicons | - | 0
Kernel Methods in Hyperbolic Spaces | - | 0
Faster Training of Word Embeddings | - | 0
Key Phrase Extraction & Applause Prediction | - | 0
Ruminating Word Representations with Random Noise Masking | - | 0
Text Document Clustering: Wordnet vs. TF-IDF vs. Word Embeddings | - | 0
Page 122 of 401

Leaderboard

No leaderboard results yet.