
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network approaches that learn representations while training on an NLP task such as language modeling or document classification.
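As a concrete illustration of mapping words to real-valued vectors, the sketch below trains a small skip-gram Word2Vec model with the gensim library. This is a minimal sketch, assuming gensim >= 4.0 is installed; the toy corpus is purely illustrative, since useful embeddings require much larger training text.

# Minimal sketch: skip-gram Word2Vec embeddings via gensim (assumes gensim >= 4.0).
from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences (real training uses far more text).
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["glove", "and", "word2vec", "learn", "embeddings", "from", "text"],
    ["similar", "words", "get", "similar", "vectors"],
]

model = Word2Vec(
    sentences,
    vector_size=100,  # dimensionality of the embedding vectors
    window=5,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram, 0 = CBOW
)

vec = model.wv["embeddings"]           # a 100-dimensional numpy vector
print(vec.shape)                       # (100,)
print(model.wv.most_similar("words"))  # nearest neighbors by cosine similarity

After training, every vocabulary word is a point in a 100-dimensional space, and words appearing in similar contexts end up with nearby vectors, which is the property the papers listed below build on.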

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2041–2050 of 4002 papers

Title | Status | Hype
A Transparent Framework for Evaluating Unintended Demographic Bias in Word Embeddings | - | 0
LSTMEmbed: Learning Word and Sense Representations from a Large Semantically Annotated Corpus with Long Short-Term Memories | - | 0
Towards Automating Healthcare Question Answering in a Noisy Multilingual Low-Resource Setting | - | 0
Towards Unsupervised Text Classification Leveraging Experts and Word Embeddings | - | 0
Unsupervised Parallel Sentence Extraction with Parallel Segment Detection Helps Machine Translation | Code | 0
Few-Shot Representation Learning for Out-Of-Vocabulary Words | Code | 0
Multilingual, Multi-scale and Multi-layer Visualization of Intermediate Representations | - | 0
Learning to Rank Broad and Narrow Queries in E-Commerce | - | 0
Supervised Contextual Embeddings for Transfer Learning in Natural Language Processing Tasks | Code | 0
Is It Worth the Attention? A Comparative Evaluation of Attention Layers for Argument Unit Segmentation | Code | 0

No leaderboard results yet.