
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
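Concretely, an embedding is a row lookup into a learned matrix of real-valued vectors. The sketch below is a minimal illustration of that mapping only; the tiny vocabulary, the dimensionality, and the random initialization are hypothetical stand-ins for a table that would normally be learned from data.

```python
import numpy as np

# Hypothetical toy vocabulary; real vocabularies hold tens of thousands of words.
vocab = {"king": 0, "queen": 1, "man": 2, "woman": 3}
dim = 4  # embedding dimensionality (typically 50-300 in practice)

# The embedding table: one real-valued vector per vocabulary word.
# Random here for illustration; in practice these rows are learned.
rng = np.random.default_rng(0)
E = rng.normal(size=(len(vocab), dim))

def embed(word: str) -> np.ndarray:
    """Map a word to its vector by looking up its row in the table."""
    return E[vocab[word]]

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity, the usual way closeness of word vectors is measured."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(embed("king"))
print(cosine(embed("king"), embed("queen")))
```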

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)
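As an illustration of the Word2Vec approach mentioned above, the snippet below trains skip-gram embeddings with gensim, assuming gensim 4.x is installed. The corpus and hyperparameters are illustrative only, not a recipe taken from any of the papers listed here.

```python
from gensim.models import Word2Vec

# Hypothetical toy corpus; real training uses millions of sentences.
corpus = [
    ["the", "cat", "sat", "on", "the", "mat"],
    ["the", "dog", "sat", "on", "the", "rug"],
    ["cats", "and", "dogs", "are", "pets"],
]

# sg=1 selects the skip-gram objective; sg=0 would use CBOW instead.
model = Word2Vec(
    corpus,
    vector_size=50,  # dimensionality of the learned vectors
    window=2,        # context window size
    min_count=1,     # keep even rare words in this tiny corpus
    sg=1,
    epochs=50,
)

vector = model.wv["cat"]             # the learned vector for "cat"
print(model.wv.most_similar("cat"))  # nearest neighbours by cosine similarity
```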

Papers

Showing 701-710 of 4,002 papers

Title | Hype
Chemical Identification and Indexing in PubMed Articles via BERT and Text-to-Text Approaches | 0
Chinese Embedding via Stroke and Glyph Information: A Dual-channel View | 0
An Unsupervised Approach for Mapping between Vector Spaces | 0
Combining Long Short Term Memory and Convolutional Neural Network for Cross-Sentence n-ary Relation Extraction | 0
Chinese Zero Pronoun Resolution with Deep Neural Networks | 0
Are Girls Neko or Shōjo? Cross-Lingual Alignment of Non-Isomorphic Embeddings with Iterative Normalization | 0
Combining Pretrained High-Resource Embeddings and Subword Representations for Low-Resource Languages | 0
BLISS in Non-Isometric Embedding Spaces | 0
CitiusNLP at SemEval-2020 Task 3: Comparing Two Approaches for Word Vector Contextualization | 0
Blinov: Distributed Representations of Words for Aspect-Based Sentiment Analysis at SemEval 2014 | 0
