
Word Embeddings

Word embedding is the collective name for a set of language modeling and feature learning techniques in natural language processing (NLP) where words or phrases from the vocabulary are mapped to vectors of real numbers.
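Concretely, the mapping can be pictured as a lookup into a matrix with one row of real numbers per vocabulary word. The sketch below is a minimal illustration of that idea, using toy, hand-picked vectors (not taken from any trained model) and cosine similarity, a common way to compare embeddings:

```python
# A minimal sketch of the word-to-vector mapping described above.
# The vocabulary, dimensionality, and vector values are toy choices
# for illustration, not from any trained model.
import numpy as np

vocab = {"king": 0, "queen": 1, "apple": 2}

# One row of real numbers per vocabulary word (an "embedding matrix").
# Real embeddings typically use 50-300 dimensions; 4 here for brevity.
embeddings = np.array([
    [0.8, 0.1, 0.7, 0.2],   # king
    [0.7, 0.2, 0.8, 0.1],   # queen
    [0.1, 0.9, 0.0, 0.8],   # apple
])

def vector(word: str) -> np.ndarray:
    """Map a word to its vector of real numbers."""
    return embeddings[vocab[word]]

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    """Semantic similarity is commonly measured as cosine similarity."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

print(cosine(vector("king"), vector("queen")))  # high: related words
print(cosine(vector("king"), vector("apple")))  # lower: unrelated words
```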

Techniques for learning word embeddings include predictive neural models such as Word2Vec, count-based methods such as GloVe, and other approaches that train on an NLP task such as language modeling or document classification.
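As one illustration of how such a technique is used in practice, the sketch below trains a skip-gram Word2Vec model with the gensim library (assumed installed, 4.x API) on a tiny stand-in corpus; the corpus, hyperparameters, and printed outputs are placeholders, not a definitive recipe:

```python
# A hedged sketch of learning embeddings with Word2Vec, assuming the
# gensim library (pip install gensim) and its 4.x API. The three-sentence
# corpus is a toy stand-in for real tokenized text.
from gensim.models import Word2Vec

corpus = [
    ["word", "embeddings", "map", "words", "to", "vectors"],
    ["word2vec", "learns", "embeddings", "from", "context", "windows"],
    ["glove", "learns", "embeddings", "from", "co-occurrence", "counts"],
]

model = Word2Vec(
    sentences=corpus,
    vector_size=50,   # dimensionality of the learned vectors
    window=2,         # context window size
    min_count=1,      # keep every word in this tiny corpus
    sg=1,             # 1 = skip-gram; 0 = CBOW
    epochs=100,       # many passes so the toy corpus converges
)

print(model.wv["embeddings"])             # the learned real-valued vector
print(model.wv.most_similar("word2vec"))  # nearest neighbors in vector space
```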

(Image credit: Dynamic Word Embedding for Evolving Semantic Discovery)

Papers

Showing 1721-1730 of 4,002 papers (page 173 of 401)

Title | Status | Hype
HICEM: A High-Coverage Emotion Model for Artificial Emotional Intelligence | | 0
Hierarchical Autoregressive Transformers: Combining Byte- and Word-Level Processing for Robust, Adaptable Language Models | | 0
An evaluation of Czech word embeddings | | 0
ConTextING: Granting Document-Wise Contextual Embeddings to Graph Neural Networks for Inductive Text Classification | | 0
A Survey On Neural Word Embeddings | | 0
Learning Meta Word Embeddings by Unsupervised Weighted Concatenation of Source Embeddings | | 0
HIN-RNN: A Graph Representation Learning Neural Network for Fraudster Group Detection With No Handcrafted Features | | 0
Analysis of Gender Bias in Social Perception and Judgement Using Chinese Word Embeddings | | 0
HIT-SCIR at MRP 2019: A Unified Pipeline for Meaning Representation Parsing via Efficient Training and Effective Encoding | | 0
Hypothesis Testing based Intrinsic Evaluation of Word Embeddings | | 0

No leaderboard results yet.