Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
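
As a minimal sketch of that mapping (the toy vocabulary, dimensionality, and random initialisation below are illustrative assumptions, not a trained embedding model):

```python
import numpy as np

# Minimal sketch of the core idea: each vocabulary word is mapped to a
# dense vector of real numbers. The vocabulary, dimensionality, and
# random initialisation here are illustrative assumptions, not a
# trained embedding model.
vocab = ["king", "queen", "man", "woman"]
dim = 50
rng = np.random.default_rng(0)
embeddings = {word: rng.normal(size=dim) for word in vocab}

def cosine(u, v):
    # Cosine similarity is the usual way embedding vectors are compared.
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(embeddings["king"].shape)                        # (50,)
print(cosine(embeddings["king"], embeddings["queen"]))
```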

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train embeddings on an NLP task such as language modeling or document classification.
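
As a hedged sketch, one such technique, Word2Vec, can be run through the gensim library; the toy corpus and hyperparameters below are illustrative assumptions rather than recommended settings:

```python
from gensim.models import Word2Vec

# Hedged sketch of learning embeddings with Word2Vec (skip-gram) via the
# gensim library. The toy corpus and hyperparameters are illustrative
# assumptions; in practice the model is trained on a large tokenised corpus.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "local", "context", "windows"],
    ["glove", "fits", "embeddings", "to", "global", "cooccurrence", "counts"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the learned vectors
    window=3,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

vector = model.wv["embeddings"]                     # 50-dimensional numpy array
print(model.wv.most_similar("embeddings", topn=3))
```

GloVe reaches a similar end by a different route, fitting vectors to global word co-occurrence statistics rather than predicting words from local context windows.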

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1876–1900 of 4002 papers

Title | Status | Hype
Learn Interpretable Word Embeddings Efficiently with von Mises-Fisher Distribution | | 0
On Understanding Knowledge Graph Representation | | 0
Extremely Small BERT Models from Mixed-Vocabulary Training | | 0
Classification Attention for Chinese NER | | 0
Multi-source Multi-view Transfer Learning in Neural Topic Modeling with Pretrained Topic and Word Embeddings | | 0
Distilled embedding: non-linear embedding factorization using knowledge distillation | | 0
Interpreting Knowledge Graph Relation Representation from Word Embeddings | | 0
DeepXML: Scalable & Accurate Deep Extreme Classification for Matching User Queries to Advertiser Bid Phrases | | 0
Code-switching Language Modeling With Bilingual Word Embeddings: A Case Study for Egyptian Arabic-English | | 0
Deep Text Mining of Instagram Data Without Strong Supervision | Code | 0
GNTeam at 2018 n2c2: Feature-augmented BiLSTM-CRF for drug-related entity recognition in hospital discharge summaries | Code | 0
Generating Timelines by Modeling Semantic Change | Code | 0
Low-Rank Approximation of Matrices for PMI-based Word Embeddings | | 0
Cross-lingual Dependency Parsing with Unlabeled Auxiliary Languages | Code | 0
Triplet-Aware Scene Graph Embeddings | | 0
Multi-sense Definition Modeling using Word Sense Decompositions | | 0
CogniVal: A Framework for Cognitive Word Embedding Evaluation | Code | 0
Decision-Directed Data Decomposition | Code | 0
Cross-Lingual Contextual Word Embeddings Mapping With Multi-Sense Words In Mind | | 0
Corporate IT-support Help-Desk Process Hybrid-Automation Solution with Machine Learning Approach | | 0
Hierarchical Meta-Embeddings for Code-Switching Named Entity Recognition | Code | 0
Do We Need Neural Models to Explain Human Judgments of Acceptability? | | 0
Multi Sense Embeddings from Topic Models | | 0
Cross-Lingual BERT Transformation for Zero-Shot Dependency Parsing | Code | 0
Multi-view and Multi-source Transfers in Neural Topic Modeling with Pretrained Topic and Word Embeddings | | 0
Page 76 of 161

No leaderboard results yet.