
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
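The mapping above can be illustrated with a minimal sketch: a toy lookup table from words to (hypothetical, hand-picked) real-valued vectors, plus cosine similarity to compare them. Real embeddings are learned from data and typically have 50 to 300 dimensions.

```python
import math

# Toy 4-dimensional embeddings. The values are illustrative only;
# real embeddings are learned from corpora, not set by hand.
embeddings = {
    "king":  [0.8, 0.6, 0.1, 0.2],
    "queen": [0.7, 0.7, 0.1, 0.3],
    "apple": [0.1, 0.0, 0.9, 0.8],
}

def cosine_similarity(u, v):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Semantically related words should end up with more similar vectors.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # high
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # low
```

Because words live in a common vector space, "more related" reduces to "smaller angle between vectors", which is what downstream NLP models exploit.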

Techniques for learning word embeddings include Word2Vec, GloVe, and other neural network-based approaches that train on an NLP task such as language modeling or document classification.
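As a rough sketch of how such training works, the following is a minimal skip-gram loop (the core idea behind Word2Vec's skip-gram variant) on a hypothetical toy corpus. It uses a full softmax for clarity; production implementations such as gensim add negative sampling, subsampling, and many other optimizations.

```python
import numpy as np

# Toy corpus and hyperparameters, chosen for illustration only.
corpus = "the cat sat on the mat the dog sat on the rug".split()
vocab = sorted(set(corpus))
word2id = {w: i for i, w in enumerate(vocab)}
V, D, window, lr = len(vocab), 8, 2, 0.05

rng = np.random.default_rng(0)
W_in = rng.normal(0, 0.1, (V, D))   # center-word embeddings (kept as the result)
W_out = rng.normal(0, 0.1, (D, V))  # output ("context") weights

# (center, context) training pairs from a sliding window over the corpus.
pairs = [
    (word2id[corpus[i]], word2id[corpus[j]])
    for i in range(len(corpus))
    for j in range(max(0, i - window), min(len(corpus), i + window + 1))
    if i != j
]

for epoch in range(200):
    for c, o in pairs:
        h = W_in[c]                       # hidden layer = center embedding
        scores = h @ W_out                # unnormalized scores over vocab
        probs = np.exp(scores - scores.max())
        probs /= probs.sum()              # softmax
        grad = probs.copy()
        grad[o] -= 1.0                    # d(cross-entropy)/d(scores)
        W_in[c] -= lr * (W_out @ grad)    # backprop into the embedding
        W_out -= lr * np.outer(h, grad)   # backprop into output weights

# Each vocabulary word now has a learned D-dimensional vector in W_in.
print(W_in.shape)
```

The rows of `W_in` are the word embeddings: training pushes words that appear in similar contexts toward similar vectors.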

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 941–950 of 4002 papers

Title                                                                                                   | Status | Hype
--------------------------------------------------------------------------------------------------------|--------|-----
SA2SL: From Aspect-Based Sentiment Analysis to Social Listening System for Business Intelligence         | Code   | 1
An Explanatory Query-Based Framework for Exploring Academic Expertise                                    |        | 0
Semantic Frame Induction using Masked Word Embeddings and Two-Step Clustering                            |        | 0
RAW-C: Relatedness of Ambiguous Words--in Context (A New Lexical Resource for English)                   | Code   | 0
Inspecting the concept knowledge graph encoded by modern language models                                 |        | 0
A data-driven strategy to combine word embeddings in information retrieval                               |        | 0
Word Embedding Transformation for Robust Unsupervised Bilingual Lexicon Induction                        |        | 0
HIN-RNN: A Graph Representation Learning Neural Network for Fraudster Group Detection With No Handcrafted Features |        | 0
ViBERTgrid: A Jointly Trained Multi-Modal 2D Document Representation for Key Information Extraction from Documents |        | 0
View Distillation with Unlabeled Data for Extracting Adverse Drug Effects from User-Generated Data       |        | 0
Page 95 of 401

No leaderboard results yet.