
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
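Concretely, a trained embedding is just a lookup table: a |V| × d matrix of real numbers indexed by word id, with similarity between words usually measured by cosine similarity of their vectors. A minimal sketch in Python (the toy vocabulary, dimensionality, and random initialization below are illustrative assumptions, not taken from any particular model):

```python
import numpy as np

# Hypothetical toy vocabulary; real vocabularies contain tens of thousands of words.
vocab = {"king": 0, "queen": 1, "apple": 2}
dim = 4  # real embeddings typically use 50-300 dimensions

# The embedding matrix maps each word index to a dense real-valued vector.
# Here it is random; in practice it is learned from data.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), dim))

def embed(word: str) -> np.ndarray:
    """Look up the vector for a word."""
    return embeddings[vocab[word]]

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity, the usual way to compare embedding vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(embed("king"))                             # the 4-dim vector for "king"
print(cosine(embed("king"), embed("queen")))     # similarity in [-1, 1]
```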

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural approaches that train the vectors on an NLP task such as language modeling or document classification.
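For instance, the skip-gram variant of Word2Vec learns vectors by predicting each word's context words. A minimal sketch using the gensim library (gensim 4.x API assumed; the three-sentence corpus is an illustrative stand-in for real training data):

```python
from gensim.models import Word2Vec

# Illustrative toy corpus: a list of tokenized sentences.
corpus = [
    ["the", "king", "rules", "the", "kingdom"],
    ["the", "queen", "rules", "the", "kingdom"],
    ["i", "ate", "an", "apple"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,  # dimensionality of the learned vectors
    window=2,        # context window size
    min_count=1,     # keep every word in this tiny corpus
    sg=1,            # 1 = skip-gram, 0 = CBOW
    epochs=50,
)

vec = model.wv["king"]                # the learned 50-dim vector for "king"
print(model.wv.most_similar("king"))  # nearest neighbors by cosine similarity
```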

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1061–1070 of 4,002 papers (page 107 of 401)

Title | Status | Hype
An Introduction to Robust Graph Convolutional Networks | – | 0
Bilingual Distributed Word Representations from Document-Aligned Comparable Data | – | 0
Bilingual Correspondence Recursive Autoencoder for Statistical Machine Translation | – | 0
An Intrinsic Nearest Neighbor Analysis of Neural Machine Translation Architectures | – | 0
Aggression Identification and Multi Lingual Word Embeddings | – | 0
Bilingual Autoencoders with Global Descriptors for Modeling Parallel Sentences | – | 0
Bilexical Embeddings for Quality Estimation | – | 0
An Interpretable Deep-Learning Framework for Predicting Hospital Readmissions From Electronic Health Records | – | 0
An Improved Single Step Non-autoregressive Transformer for Automatic Speech Recognition | – | 0
Big Data Small Data, In Domain Out-of Domain, Known Word Unknown Word: The Impact of Word Representation on Sequence Labelling Tasks | – | 0

Leaderboard

No leaderboard results yet.