SOTAVerified

Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
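Before neural methods like Word2Vec, embeddings were often built from co-occurrence counts; the idea that words are represented by dense real-valued vectors can be illustrated with a minimal count-based sketch (PPMI weighting plus SVD). The toy corpus, window size, and dimensionality below are illustrative assumptions, not taken from this page.

```python
# Minimal count-based word embeddings: build a co-occurrence matrix over a
# toy corpus, weight it with positive PMI, and reduce it with SVD.
import numpy as np

corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the rug".split(),
    "the cat chased the dog".split(),
]
vocab = sorted({w for sent in corpus for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

# Symmetric co-occurrence counts within a +/-2 token window (assumed setting).
window = 2
C = np.zeros((len(vocab), len(vocab)))
for sent in corpus:
    for i, w in enumerate(sent):
        for j in range(max(0, i - window), min(len(sent), i + window + 1)):
            if j != i:
                C[idx[w], idx[sent[j]]] += 1

# Positive pointwise mutual information (PPMI): keep only positive,
# finite PMI values; everything else becomes zero.
total = C.sum()
row = C.sum(axis=1, keepdims=True)
col = C.sum(axis=0, keepdims=True)
with np.errstate(divide="ignore", invalid="ignore"):
    pmi = np.log((C * total) / (row * col))
ppmi = np.where(np.isfinite(pmi) & (pmi > 0), pmi, 0.0)

# Dense embeddings: top-k left singular vectors scaled by singular values.
U, S, _ = np.linalg.svd(ppmi)
k = 3
embeddings = U[:, :k] * S[:k]

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

# "cat" and "dog" appear in overlapping contexts, so their vectors
# tend to be more similar than those of unrelated word pairs.
print(cosine(embeddings[idx["cat"]], embeddings[idx["dog"]]))
```

Neural approaches such as Word2Vec replace the explicit count matrix with vectors learned by gradient descent on a prediction task, but the resulting object is the same: one real-valued vector per vocabulary word.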

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1401–1450 of 4002 papers

Title | Status | Hype
Explainable Depression Detection with Multi-Modalities Using a Hybrid Deep Learning Model on Social Media | - | 0
How Self-Attention Improves Rare Class Performance in a Question-Answering Dialogue Agent | - | 0
COVID-19 and Arabic Twitter: How can Arab World Governments and Public Health Organizations Learn from Social Media? | - | 0
Whole-Word Segmental Speech Recognition with Acoustic Word Embeddings | Code | 0
Automated Scoring of Clinical Expressive Language Evaluation Tasks | - | 0
Adversarial Evaluation of BERT for Biomedical Named Entity Recognition | - | 0
Visual Question Generation from Radiology Images | Code | 1
Analyzing the Framing of 2020 Presidential Candidates in the News | - | 0
CopyBERT: A Unified Approach to Question Generation with Self-Attention | - | 0
Contextual and Non-Contextual Word Embeddings: an in-depth Linguistic Investigation | - | 0
Word Embeddings as Tuples of Feature Probabilities | - | 0
Neural Metaphor Detection with a Residual biLSTM-CRF Model | - | 0
Getting the ##life out of living: How Adequate Are Word-Pieces for Modelling Complex Morphology? | - | 0
Metaphor Detection Using Contextual Word Embeddings From Transformers | - | 0
Evaluating Natural Alpha Embeddings on Intrinsic and Extrinsic Tasks | - | 0
Improving Biomedical Analogical Retrieval with Embedding of Structural Dependencies | - | 0
Quantifying 60 Years of Gender Bias in Biomedical Research with Word Embeddings | - | 0
Character aware models with similarity learning for metaphor detection | - | 0
LIT Team's System Description for Japanese-Chinese Machine Translation Task in IWSLT 2020 | - | 0
Token Level Identification of Multiword Expressions Using Contextual Information | - | 0
Non-Linear Instance-Based Cross-Lingual Mapping for Non-Isomorphic Embedding Spaces | - | 0
Neural-DINF: A Neural Network based Framework for Measuring Document Influence | - | 0
Predicting Degrees of Technicality in Automatic Terminology Extraction | - | 0
Adaptive Compression of Word Embeddings | - | 0
Estimating Mutual Information Between Dense Word Embeddings | - | 0
Entity-Aware Dependency-Based Deep Graph Attention Network for Comparative Preference Classification | - | 0
Transition-based Semantic Dependency Parsing with Pointer Networks | - | 0
Interpreting Pretrained Contextualized Representations via Reductions to Static Embeddings | - | 0
A Graph-based Coarse-to-fine Method for Unsupervised Bilingual Lexicon Induction | - | 0
He said "who's gonna take care of your children when you are at ACL?": Reported Sexist Acts are Not Sexist | - | 0
In Neural Machine Translation, What Does Transfer Learning Transfer? | - | 0
OSCaR: Orthogonal Subspace Correction and Rectification of Biases in Word Embeddings | Code | 1
Rethinking Positional Encoding in Language Pre-training | Code | 1
Multilingual Jointly Trained Acoustic and Written Word Embeddings | Code | 1
Supervised Understanding of Word Embeddings | - | 0
Using Company Specific Headlines and Convolutional Neural Networks to Predict Stock Fluctuations | - | 0
Dirichlet-Smoothed Word Embeddings for Low-Resource Settings | - | 0
MDR Cluster-Debias: A Nonlinear Word Embedding Debiasing Pipeline | - | 0
Learning aligned embeddings for semi-supervised word translation using Maximum Mean Discrepancy | - | 0
On the Learnability of Concepts: With Applications to Comparing Word Embedding Algorithms | - | 0
Evaluating a Multi-sense Definition Generation Model for Multiple Languages | - | 0
Attention improves concentration when learning node embeddings | - | 0
A Monolingual Approach to Contextualized Word Embeddings for Mid-Resource Languages | - | 0
ScoreGAN: A Fraud Review Detector based on Multi Task Learning of Regulated GAN with Data Augmentation | - | 0
Embed2Detect: Temporally Clustered Embedded Words for Event Detection in Social Media | Code | 1
CS-Embed at SemEval-2020 Task 9: The effectiveness of code-switched word embeddings for sentiment analysis | Code | 0
Combining word embeddings and convolutional neural networks to detect duplicated questions | - | 0
ValNorm Quantifies Semantics to Reveal Consistent Valence Biases Across Languages and Over Centuries | Code | 0
Detecting Emergent Intersectional Biases: Contextualized Word Embeddings Contain a Distribution of Human-like Biases | Code | 1
DeCLUTR: Deep Contrastive Learning for Unsupervised Textual Representations | Code | 1
Page 29 of 81

No leaderboard results yet.