
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
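For readers new to the area, the following is a minimal sketch of training skip-gram Word2Vec embeddings, assuming the gensim library (version 4 or later) is available; the toy corpus and all parameter values are illustrative and not drawn from any paper listed below.

```python
# Minimal sketch: learning word embeddings with skip-gram Word2Vec via gensim.
# Assumption: gensim >= 4.0; the corpus below is a toy example for illustration.
from gensim.models import Word2Vec

# Each document is a pre-tokenized list of words.
corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "local", "context"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the real-valued embedding vectors
    window=2,         # context window size on each side of the target word
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
    epochs=50,        # more passes help on a corpus this small
)

vector = model.wv["embeddings"]   # the learned vector for one word
print(vector.shape)               # (50,)
print(model.wv.most_similar("embeddings", topn=3))
```

Once trained, nearest-neighbor queries such as `most_similar` are what make these vectors useful: words that occur in similar contexts end up close together in the embedding space.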

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1421-1430 of 4002 papers

Neural-DINF: A Neural Network based Framework for Measuring Document Influence
In Neural Machine Translation, What Does Transfer Learning Transfer?
Adaptive Compression of Word Embeddings
Estimating Mutual Information Between Dense Word Embeddings
Entity-Aware Dependency-Based Deep Graph Attention Network for Comparative Preference Classification
He said "who's gonna take care of your children when you are at ACL?": Reported Sexist Acts are Not Sexist
Transition-based Semantic Dependency Parsing with Pointer Networks
Non-Linear Instance-Based Cross-Lingual Mapping for Non-Isomorphic Embedding Spaces
Interpreting Pretrained Contextualized Representations via Reductions to Static Embeddings
A Graph-based Coarse-to-fine Method for Unsupervised Bilingual Lexicon Induction

No leaderboard results yet.