
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
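Concretely, this mapping is just a lookup into a matrix with one real-valued row per vocabulary word, and embeddings are typically compared with cosine similarity. A minimal sketch of that idea follows; the vocabulary, dimensionality, and (randomly initialized) vectors are illustrative assumptions, not values from any trained model.

```python
import numpy as np

# Toy vocabulary and embedding table; in practice the rows come from
# a trained model (Word2Vec, GloVe, etc.), not random initialization.
vocab = {"king": 0, "queen": 1, "apple": 2}
dim = 4
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(len(vocab), dim))  # one row per word

def embed(word: str) -> np.ndarray:
    """Map a word to its real-valued vector via table lookup."""
    return embeddings[vocab[word]]

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Cosine similarity, the usual metric for comparing embeddings."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

sim = cosine(embed("king"), embed("queen"))
```

With trained vectors, semantically related words (e.g. "king" and "queen") score much higher under `cosine` than unrelated ones; with the random vectors above, `sim` is only guaranteed to lie in [-1, 1].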

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
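To make the training idea concrete, below is a minimal pure-NumPy sketch of skip-gram with negative sampling, the core objective behind Word2Vec: each word's vector is pushed toward the vectors of words seen in its context and away from randomly sampled "negative" words. The toy corpus, dimensionality, and hyperparameters are illustrative assumptions, and no deduplication of accidental true-context negatives is done.

```python
import numpy as np

# Toy corpus; real training uses millions of sentences.
corpus = [["the", "cat", "sat", "on", "the", "mat"],
          ["the", "dog", "sat", "on", "the", "rug"]]
vocab = sorted({w for s in corpus for w in s})
idx = {w: i for i, w in enumerate(vocab)}
V, D = len(vocab), 8  # vocabulary size, embedding dimension

rng = np.random.default_rng(1)
W_in = rng.normal(scale=0.1, size=(V, D))   # target-word embeddings
W_out = rng.normal(scale=0.1, size=(V, D))  # context-word embeddings

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

lr, window, k = 0.05, 2, 3  # learning rate, context window, negatives per pair
for epoch in range(50):
    for sent in corpus:
        for i, w in enumerate(sent):
            for j in range(max(0, i - window), min(len(sent), i + window + 1)):
                if i == j:
                    continue
                t, c = idx[w], idx[sent[j]]
                negs = rng.integers(0, V, size=k)  # uniform negatives (a simplification)
                # Logistic loss: label 1 for the true context, 0 for negatives.
                for ctx, label in [(c, 1.0)] + [(n, 0.0) for n in negs]:
                    score = sigmoid(W_in[t] @ W_out[ctx])
                    grad = score - label          # d(loss)/d(score)
                    g_in = grad * W_out[ctx]      # use old W_out for W_in's gradient
                    W_out[ctx] -= lr * grad * W_in[t]
                    W_in[t] -= lr * g_in

# After training, the rows of W_in serve as the learned word vectors.
vec_cat = W_in[idx["cat"]]
```

Real implementations add frequency-based negative sampling, subsampling of frequent words, and minibatched updates, but the gradient step above is the same objective at heart.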

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2251–2300 of 4002 papers

Title | Status | Hype
----- | ------ | ----
Multi-Label Image Recognition with Graph Convolutional Networks | Code | 0
ThisIsCompetition at SemEval-2019 Task 9: BERT is unstable for out-of-domain samples | — | 0
Exploring Fine-Tuned Embeddings that Model Intensifiers for Emotion Analysis | — | 0
Effective Context and Fragment Feature Usage for Named Entity Recognition | — | 0
Alternative Weighting Schemes for ELMo Embeddings | Code | 0
Density Matching for Bilingual Word Embedding | Code | 0
Text Classification Components for Detecting Descriptions and Names of CAD models | — | 0
ReWE: Regressing Word Embeddings for Regularization of Neural Machine Translation Systems | — | 0
Generative Adversarial Networks for text using word2vec intermediaries | Code | 0
Unsupervised Domain Adaptation of Contextualized Embeddings for Sequence Labeling | Code | 0
Probing Biomedical Embeddings from Language Models | Code | 0
Evaluating KGR10 Polish word embeddings in the recognition of temporal expressions using BiLSTM-CRF | — | 0
Black is to Criminal as Caucasian is to Police: Detecting and Removing Multiclass Bias in Word Embeddings | Code | 0
Identification, Interpretability, and Bayesian Word Embeddings | Code | 0
Attentive Mimicking: Better Word Embeddings by Attending to Informative Contexts | Code | 0
Adaptation of Hierarchical Structured Models for Speech Act Recognition in Asynchronous Conversation | — | 0
Unsupervised Abbreviation Disambiguation Contextual disambiguation using word embeddings | — | 0
Multimodal Machine Translation with Embedding Prediction | Code | 0
SART - Similarity, Analogies, and Relatedness for Tatar Language: New Benchmark Datasets for Word Embeddings Evaluation | Code | 0
Acoustically Grounded Word Embeddings for Improved Acoustics-to-Word Speech Recognition | — | 0
Integrating Semantic Knowledge to Tackle Zero-shot Text Classification | Code | 0
Learning semantic sentence representations from visually grounded language without lexical knowledge | Code | 0
Deep Learning and Word Embeddings for Tweet Classification for Crisis Response | — | 0
On Measuring Social Biases in Sentence Encoders | Code | 0
Question Embeddings Based on Shannon Entropy: Solving intent classification task in goal-oriented dialogue system | Code | 0
Expanding the Text Classification Toolbox with Cross-Lingual Embeddings | — | 0
LINSPECTOR: Multilingual Probing Tasks for Word Representations | Code | 0
Learning Entity Representations for Few-Shot Reconstruction of Wikipedia Categories | — | 0
Personalized Neural Embeddings for Collaborative Filtering with Text | — | 0
ETNLP: a visual-aided systematic approach to select pre-trained embeddings for a downstream task | Code | 0
Lipstick on a Pig: Debiasing Methods Cover up Systematic Gender Biases in Word Embeddings But do not Remove Them | Code | 0
Context-Aware Cross-Lingual Mapping | Code | 0
Creation and Evaluation of Datasets for Distributional Semantics Tasks in the Digital Humanities Domain | — | 0
Improving Cross-Domain Chinese Word Segmentation with Word Embeddings | Code | 0
Russian Language Datasets in the Digitial Humanities Domain and Their Evaluation with Word Embeddings | Code | 0
Relation Extraction Datasets in the Digital Humanities Domain and their Evaluation with Word Embeddings | — | 0
Using Word Embeddings for Visual Data Exploration with Ontodia and Wikidata | — | 0
Using natural language processing techniques to extract information on the properties and functionalities of energetic materials from large text corpora | Code | 0
Efficient Contextual Representation Learning Without Softmax Layer | — | 0
A Framework for Decoding Event-Related Potentials from Text | — | 0
Still a Pain in the Neck: Evaluating Text Representations on Lexical Composition | Code | 0
Interpretable Structure-aware Document Encoders with Hierarchical Attention | — | 0
Context Vectors are Reflections of Word Vectors in Half the Dimensions | — | 0
SuperTML: Two-Dimensional Word Embedding for the Precognition on Structured Tabular Data | Code | 0
Cross-Lingual Alignment of Contextual Word Embeddings, with Applications to Zero-shot Dependency Parsing | Code | 0
Leveraging Deep Graph-Based Text Representation for Sentiment Polarity Applications | — | 0
Vector of Locally-Aggregated Word Embeddings (VLAWE): A Novel Document-level Representation | Code | 0
VCWE: Visual Character-Enhanced Word Embeddings | Code | 0
Enhancing Clinical Concept Extraction with Contextual Embeddings | — | 0
Learned In Speech Recognition: Contextual Acoustic Word Embeddings | — | 0
Page 46 of 81

No leaderboard results yet.