SOTAVerified

Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural-network-based approaches that train on an NLP task such as language modeling or document classification.
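The mapping from words to real-valued vectors can be sketched with the classic count-based route: build a co-occurrence matrix over a context window and factor it with a truncated SVD. The snippet below is a minimal illustration under stated assumptions; the toy corpus and the helper names (`vec`, `cosine`) are invented for this example and do not come from any paper listed on this page. Word2Vec and GloVe instead learn vectors by optimizing predictive or weighted least-squares objectives, but they produce the same kind of object: one dense vector per vocabulary word.

```python
import numpy as np

# Toy corpus -- an illustrative assumption, not data from any listed paper.
corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the log".split(),
    "cats and dogs are animals".split(),
]

# Vocabulary and word -> index mapping.
vocab = sorted({w for sent in corpus for w in sent})
idx = {w: i for i, w in enumerate(vocab)}

# Count co-occurrences within a symmetric context window.
window = 2
C = np.zeros((len(vocab), len(vocab)))
for sent in corpus:
    for i, w in enumerate(sent):
        lo = max(0, i - window)
        hi = min(len(sent), i + window + 1)
        for j in range(lo, hi):
            if j != i:
                C[idx[w], idx[sent[j]]] += 1.0

# Truncated SVD of the count matrix yields dense, real-valued vectors.
U, S, _ = np.linalg.svd(C, full_matrices=False)
dim = 4
embeddings = U[:, :dim] * S[:dim]

def vec(word):
    """Return the embedding row for a word."""
    return embeddings[idx[word]]

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9))
```

In this toy corpus "cat" and "dog" appear in identical contexts, so their co-occurrence rows, and hence their embedding vectors, coincide; the distributional hypothesis behind all of these methods is that words in similar contexts should end up with similar vectors.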

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 2276–2300 of 4002 papers

Title | Status | Hype
Expanding the Text Classification Toolbox with Cross-Lingual Embeddings | - | 0
LINSPECTOR: Multilingual Probing Tasks for Word Representations | Code | 0
Learning Entity Representations for Few-Shot Reconstruction of Wikipedia Categories | - | 0
Personalized Neural Embeddings for Collaborative Filtering with Text | - | 0
ETNLP: a visual-aided systematic approach to select pre-trained embeddings for a downstream task | Code | 0
Lipstick on a Pig: Debiasing Methods Cover up Systematic Gender Biases in Word Embeddings But do not Remove Them | Code | 0
Context-Aware Cross-Lingual Mapping | Code | 0
Creation and Evaluation of Datasets for Distributional Semantics Tasks in the Digital Humanities Domain | - | 0
Improving Cross-Domain Chinese Word Segmentation with Word Embeddings | Code | 0
Russian Language Datasets in the Digital Humanities Domain and Their Evaluation with Word Embeddings | Code | 0
Relation Extraction Datasets in the Digital Humanities Domain and their Evaluation with Word Embeddings | - | 0
Using Word Embeddings for Visual Data Exploration with Ontodia and Wikidata | - | 0
Using natural language processing techniques to extract information on the properties and functionalities of energetic materials from large text corpora | Code | 0
Efficient Contextual Representation Learning Without Softmax Layer | - | 0
A Framework for Decoding Event-Related Potentials from Text | - | 0
Still a Pain in the Neck: Evaluating Text Representations on Lexical Composition | Code | 0
Interpretable Structure-aware Document Encoders with Hierarchical Attention | - | 0
Context Vectors are Reflections of Word Vectors in Half the Dimensions | - | 0
SuperTML: Two-Dimensional Word Embedding for the Precognition on Structured Tabular Data | Code | 0
Cross-Lingual Alignment of Contextual Word Embeddings, with Applications to Zero-shot Dependency Parsing | Code | 0
Leveraging Deep Graph-Based Text Representation for Sentiment Polarity Applications | - | 0
Vector of Locally-Aggregated Word Embeddings (VLAWE): A Novel Document-level Representation | Code | 0
VCWE: Visual Character-Enhanced Word Embeddings | Code | 0
Enhancing Clinical Concept Extraction with Contextual Embeddings | - | 0
Learned In Speech Recognition: Contextual Acoustic Word Embeddings | - | 0
Page 92 of 161
