
Word Embeddings

Word embedding is the collective name for a set of language-modeling and feature-learning techniques in natural language processing (NLP) in which words or phrases from a vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
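The core idea above, that each word is mapped to a real-valued vector and that geometric closeness reflects semantic similarity, can be illustrated with a minimal sketch. The embedding values below are made-up toy numbers, not trained Word2Vec or GloVe vectors:

```python
import math

# Toy lookup table mapping words to 3-dimensional vectors.
# These values are illustrative only, not learned embeddings.
embeddings = {
    "king":  [0.8, 0.6, 0.1],
    "queen": [0.7, 0.7, 0.1],
    "apple": [0.1, 0.2, 0.9],
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; values near 1.0 mean similar."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

sim_royal = cosine_similarity(embeddings["king"], embeddings["queen"])
sim_fruit = cosine_similarity(embeddings["king"], embeddings["apple"])
print(sim_royal > sim_fruit)  # related words score higher -> True
```

Trained embedding methods learn such a table from a corpus so that words appearing in similar contexts end up with nearby vectors; cosine similarity is the standard way to compare them.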

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1701–1750 of 4002 papers

Title | Status | Hype
Robust Cross-lingual Embeddings from Parallel Sentences | Code | 0
Job Prediction: From Deep Neural Network Models to Applications | No code | 0
Encoding word order in complex embeddings | Code | 0
One-Shot Weakly Supervised Video Object Segmentation | No code | 0
Analyzing Structures in the Semantic Vector Space: A Framework for Decomposing Word Embeddings | Code | 0
The performance evaluation of Multi-representation in the Deep Learning models for Relation Extraction Task | No code | 0
Predicting the Outcome of Judicial Decisions made by the European Court of Human Rights | No code | 0
A Comparison of Architectures and Pretraining Methods for Contextualized Multilingual Word Embeddings | No code | 0
Artificial mental phenomena: Psychophysics as a framework to detect perception biases in AI models | No code | 0
Integrating Lexical Knowledge in Word Embeddings using Sprinkling and Retrofitting | No code | 0
Improving Interpretability of Word Embeddings by Generating Definition and Usage | No code | 0
CoSimLex: A Resource for Evaluating Graded Word Similarity in Context | Code | 0
Machine Translation with Cross-lingual Word Embeddings | Code | 0
Multilingual aspect clustering for sentiment analysis | Code | 0
Can AI Generate Love Advice?: Toward Neural Answer Generation for Non-Factoid Questions | No code | 0
Massive vs. Curated Word Embeddings for Low-Resourced Languages. The Case of Yorùbá and Twi | Code | 0
Measuring Social Bias in Knowledge Graph Embeddings | No code | 0
Natural Alpha Embeddings | No code | 0
A Robust Self-Learning Method for Fully Unsupervised Cross-Lingual Mappings of Word Embeddings: Making the Method Robustly Reproducible as Well | Code | 0
TU Wien @ TREC Deep Learning '19 -- Simple Contextualization for Re-ranking | Code | 1
EduBERT: Pretrained Deep Language Models for Learning Analytics | No code | 0
Incorporating Sub-Word Level Information in Language Invariant Neural Event Detection | No code | 0
Deconstructing and reconstructing word embedding algorithms | No code | 0
Inducing Relational Knowledge from BERT | No code | 0
RETRO: Relation Retrofitting For In-Database Machine Learning on Textual Data | No code | 0
Word Embedding based New Corpus for Low-resourced Language: Sindhi | No code | 0
DeFINE: DEep Factorized INput Token Embeddings for Neural Sequence Modeling | Code | 1
Taking a Stance on Fake News: Towards Automatic Disinformation Assessment via Deep Bidirectional Transformer Language Models for Stance Detection | No code | 0
Word-Class Embeddings for Multiclass Text Classification | Code | 0
Hybrid Text Feature Modeling for Disease Group Prediction using Unstructured Physician Notes | No code | 0
City2City: Translating Place Representations across Cities | No code | 0
Visual Summarization of Scholarly Videos using Word Embeddings and Keyphrase Extraction | No code | 0
A Causal Inference Method for Reducing Gender Bias in Word Embedding Relations | Code | 0
Towards robust word embeddings for noisy texts | Code | 0
Causally Denoise Word Embeddings Using Half-Sibling Regression | Code | 0
Enhancing Out-Of-Domain Utterance Detection with Data Augmentation Based on Word Embeddings | No code | 0
Anaphora Resolution in Dialogue Systems for South Asian Languages | No code | 0
Multilingual Culture-Independent Word Analogy Datasets | No code | 0
Topical Phrase Extraction from Clinical Reports by Incorporating both Local and Global Context | No code | 0
Empirical Autopsy of Deep Video Captioning Frameworks | No code | 0
SemanticZ at SemEval-2016 Task 3: Ranking Relevant Answers in Community Question Answering Using Semantic Similarity Based on Fine-tuned Word Embeddings | Code | 0
Improving Document Classification with Multi-Sense Embeddings | Code | 1
Error Analysis for Vietnamese Named Entity Recognition on Deep Neural Network Models | No code | 0
Bootstrapping NLU Models with Multi-task Learning | No code | 0
What do you mean, BERT? Assessing BERT as a Distributional Semantics Model | No code | 0
Learning Relationships between Text, Audio, and Video via Deep Canonical Correlation for Multimodal Language Analysis | No code | 0
Learning Multi-Sense Word Distributions using Approximate Kullback-Leibler Divergence | No code | 0
How to Evaluate Word Representations of Informal Domain? | Code | 0
word2ket: Space-efficient Word Embeddings inspired by Quantum Entanglement | Code | 0
Contextualized End-to-End Neural Entity Linking | No code | 0
Page 35 of 81

No leaderboard results yet.