
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
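As a minimal sketch of this word-to-vector mapping (the tiny vocabulary and three-dimensional vectors below are invented purely for illustration; real embeddings are learned from data and typically have hundreds of dimensions):

```python
import numpy as np

# Toy embedding table: each word in the vocabulary maps to a dense
# real-valued vector (values here are made up for the example).
embeddings = {
    "king":  np.array([0.8, 0.3, 0.1]),
    "queen": np.array([0.7, 0.4, 0.2]),
    "apple": np.array([0.1, 0.9, 0.6]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors; higher means more similar."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Semantically related words should sit closer together in the vector space.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high (~0.98)
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # lower (~0.44)
```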

Techniques for learning word embeddings include Word2Vec, GloVe, and other approaches that train on an NLP task such as language modeling or document classification.
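A brief sketch of how such an embedding might be trained in practice, assuming the gensim library's Word2Vec implementation is available (the toy corpus below is invented; real training would use a large tokenized corpus):

```python
from gensim.models import Word2Vec

# Toy corpus: a list of tokenized sentences (invented for illustration).
sentences = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "local", "context"],
    ["glove", "learns", "embeddings", "from", "cooccurrence", "counts"],
]

# Train a skip-gram Word2Vec model (sg=1). vector_size is the embedding
# dimension, window the context size, min_count the frequency cutoff.
model = Word2Vec(sentences, vector_size=50, window=2, min_count=1, sg=1)

# Look up the learned vector for a word and find its nearest neighbours.
vec = model.wv["embeddings"]                      # a 50-dimensional numpy array
print(model.wv.most_similar("embeddings", topn=3))
```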

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1091–1100 of 4002 papers

Title | Status | Hype
An Explanatory Query-Based Framework for Exploring Academic Expertise | | 0
Semantic Frame Induction using Masked Word Embeddings and Two-Step Clustering | | 0
Inspecting the concept knowledge graph encoded by modern language models | | 0
RAW-C: Relatedness of Ambiguous Words--in Context (A New Lexical Resource for English) | Code | 0
Word Embedding Transformation for Robust Unsupervised Bilingual Lexicon Induction | | 0
A data-driven strategy to combine word embeddings in information retrieval | | 0
ViBERTgrid: A Jointly Trained Multi-Modal 2D Document Representation for Key Information Extraction from Documents | | 0
HIN-RNN: A Graph Representation Learning Neural Network for Fraudster Group Detection With No Handcrafted Features | | 0
View Distillation with Unlabeled Data for Extracting Adverse Drug Effects from User-Generated Data | | 0
Stance Detection with BERT Embeddings for Credibility Analysis of Information on Social Media | | 0
Page 110 of 401

No leaderboard results yet.