SOTAVerified

Memory-Associated Differential Learning

2021-02-10

Yi Luo, Aiguo Chen, Bei Hui, Ke Yan

Code Available

Abstract

Conventional supervised learning focuses on the mapping from input features to output labels. After training, the learnt model alone is applied to testing features to predict testing labels in isolation; the training data are discarded and the associations among them are ignored. To take full advantage of the abundant training data and their associations, we propose a novel learning paradigm called Memory-Associated Differential (MAD) Learning. We first introduce an additional component, called Memory, to memorize all the training data. We then learn the differences of labels together with the associations of features, combining a differential equation with several sampling methods. Finally, in the evaluation phase, we predict unknown labels by inferring from the memorized facts plus the learnt differences and associations in a geometrically meaningful manner. We develop this theory first in unary situations and apply it to Image Recognition, then extend it to Link Prediction as a binary situation, where our method outperforms strong state-of-the-art baselines on the ogbl-ddi dataset.
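The paradigm described in the abstract can be sketched in miniature: keep all training pairs in a memory, fit a model of label *differences* as a function of feature differences over sampled pairs, and predict a new label by adding the learnt difference to memorized reference labels. This is a toy illustration under stated assumptions, not the paper's implementation; the linear difference model, the pair-sampling scheme, and the k-nearest-neighbor inference are all simplifications chosen here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression task (hypothetical stand-in for real data): y = 3*x + noise.
X_train = rng.uniform(-1, 1, size=(200, 1))
y_train = 3.0 * X_train[:, 0] + rng.normal(0, 0.01, 200)

# "Memory": simply retain every training example (features and labels).
memory_X, memory_y = X_train, y_train

# Learn the difference of labels as a function of the difference of features,
# here with a linear model fit on randomly sampled pairs (an assumption; the
# paper combines a differential equation with its own sampling methods).
i = rng.integers(0, len(X_train), 1000)
j = rng.integers(0, len(X_train), 1000)
dX = memory_X[i] - memory_X[j]          # feature differences
dy = memory_y[i] - memory_y[j]          # label differences
w, *_ = np.linalg.lstsq(dX, dy, rcond=None)

def predict(x, k=5):
    """Infer a label from memorized facts plus learnt differences:
    y(x) ~ mean over k nearest references of y_ref + w @ (x - x_ref)."""
    d = np.linalg.norm(memory_X - x, axis=1)
    nn = np.argsort(d)[:k]
    return float(np.mean(memory_y[nn] + (x - memory_X[nn]) @ w))

print(round(predict(np.array([0.5])), 2))  # close to 3 * 0.5 = 1.5
```

The key contrast with conventional supervised learning is visible in `predict`: rather than evaluating a standalone model on `x`, it answers by consulting memorized training facts and correcting them with the learnt difference.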

Tasks

Benchmark Results

| Dataset | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| ogbl-ddi | MAD Learning | Number of params | 1,228,897 | — | Unverified |

Reproductions