
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
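The core idea — each word is a point in a real-valued vector space, and geometric closeness tracks semantic relatedness — can be illustrated with a toy lookup table and cosine similarity. The vectors below are hand-picked for illustration, not learned.

```python
import math

# Toy embedding table: each word maps to a vector of real numbers.
# In practice these vectors are learned from large corpora; these are
# illustrative values only.
embeddings = {
    "king":  [0.8, 0.6, 0.1],
    "queen": [0.7, 0.7, 0.1],
    "apple": [0.1, 0.2, 0.9],
}

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(a * a for a in v))
    return dot / (norm_u * norm_v)

# Semantically related words sit close together in the vector space.
print(cosine(embeddings["king"], embeddings["queen"]))  # near 1.0
print(cosine(embeddings["king"], embeddings["apple"]))  # much lower
```

Downstream NLP models consume these vectors instead of raw word identities, so similarity between words becomes measurable arithmetic.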

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
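As a sketch of how such embeddings are learned, the following implements skip-gram with negative sampling, the training objective used by Word2Vec: each center word is pushed toward the vectors of its observed context words and away from randomly sampled "negative" words. The corpus, dimensionality, and hyperparameters are illustrative, not tuned.

```python
import math
import random

# Tiny illustrative corpus; real training uses large text collections.
corpus = "the king rules the land the queen rules the land".split()
window, dim, lr, epochs = 2, 8, 0.05, 200

vocab = sorted(set(corpus))
idx = {w: i for i, w in enumerate(vocab)}

random.seed(0)
# Two embedding tables, as in Word2Vec: one for center words, one for contexts.
W_in = [[random.uniform(-0.5, 0.5) for _ in range(dim)] for _ in vocab]
W_out = [[random.uniform(-0.5, 0.5) for _ in range(dim)] for _ in vocab]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

for _ in range(epochs):
    for pos, word in enumerate(corpus):
        c = idx[word]
        for off in range(-window, window + 1):
            p = pos + off
            if off == 0 or p < 0 or p >= len(corpus):
                continue
            # One observed (positive) context word, one random negative sample.
            targets = [(idx[corpus[p]], 1.0),
                       (random.randrange(len(vocab)), 0.0)]
            for t, label in targets:
                # Gradient of the logistic loss on the pair score.
                g = (sigmoid(dot(W_in[c], W_out[t])) - label) * lr
                for d in range(dim):
                    W_in[c][d], W_out[t][d] = (
                        W_in[c][d] - g * W_out[t][d],
                        W_out[t][d] - g * W_in[c][d],
                    )
```

After training, `W_in` holds one vector per vocabulary word; words appearing in similar contexts (here, "king" and "queen") tend to drift toward similar vectors. Production systems use optimized implementations with frequency-based negative sampling and subsampling rather than a loop like this.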

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1701–1725 of 4002 papers

Title | Code | Hype
Robust Cross-lingual Embeddings from Parallel Sentences | yes | 0
Job Prediction: From Deep Neural Network Models to Applications | no | 0
Encoding word order in complex embeddings | yes | 0
One-Shot Weakly Supervised Video Object Segmentation | no | 0
Analyzing Structures in the Semantic Vector Space: A Framework for Decomposing Word Embeddings | yes | 0
The performance evaluation of Multi-representation in the Deep Learning models for Relation Extraction Task | no | 0
Predicting the Outcome of Judicial Decisions made by the European Court of Human Rights | no | 0
A Comparison of Architectures and Pretraining Methods for Contextualized Multilingual Word Embeddings | no | 0
Artificial mental phenomena: Psychophysics as a framework to detect perception biases in AI models | no | 0
Integrating Lexical Knowledge in Word Embeddings using Sprinkling and Retrofitting | no | 0
Improving Interpretability of Word Embeddings by Generating Definition and Usage | no | 0
CoSimLex: A Resource for Evaluating Graded Word Similarity in Context | yes | 0
Machine Translation with Cross-lingual Word Embeddings | yes | 0
Multilingual aspect clustering for sentiment analysis | yes | 0
Can AI Generate Love Advice?: Toward Neural Answer Generation for Non-Factoid Questions | no | 0
Measuring Social Bias in Knowledge Graph Embeddings | no | 0
Massive vs. Curated Word Embeddings for Low-Resourced Languages. The Case of Yorùbá and Twi | yes | 0
Natural Alpha Embeddings | no | 0
A Robust Self-Learning Method for Fully Unsupervised Cross-Lingual Mappings of Word Embeddings: Making the Method Robustly Reproducible as Well | yes | 0
TU Wien @ TREC Deep Learning '19 -- Simple Contextualization for Re-ranking | yes | 1
EduBERT: Pretrained Deep Language Models for Learning Analytics | no | 0
Incorporating Sub-Word Level Information in Language Invariant Neural Event Detection | no | 0
Deconstructing and reconstructing word embedding algorithms | no | 0
RETRO: Relation Retrofitting For In-Database Machine Learning on Textual Data | no | 0
Inducing Relational Knowledge from BERT | no | 0
Page 69 of 161

No leaderboard results yet.