
Unsupervised Learning using Pretrained CNN and Associative Memory Bank

2018-05-02

Qun Liu, Supratik Mukhopadhyay


Abstract

Deep convolutional features extracted from a comprehensive labeled dataset contain substantial representations that can be used effectively in a new domain. Although generic features achieve good results on many visual tasks, pretrained deep CNN models require fine-tuning to be fully effective and to provide state-of-the-art performance. Fine-tuning with the backpropagation algorithm in a supervised setting is a time- and resource-consuming process. In this paper, we present a new architecture and approach for unsupervised object recognition that addresses the above-mentioned fine-tuning problem of pretrained CNN-based supervised deep learning approaches while allowing automated feature extraction. Unlike existing works, our approach is applicable to general object recognition tasks. It uses a CNN model pretrained on a related domain for automated feature extraction, pipelined with a Hopfield-network-based associative memory bank that stores patterns for classification. The use of an associative memory bank in our framework eliminates backpropagation while providing competitive performance on an unseen dataset.
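The pipeline the abstract describes (pretrained CNN feature extraction followed by a Hopfield associative memory, with no backpropagation) can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: the random vectors stand in for a pretrained CNN's feature output, features are binarized to ±1 patterns, stored with the classic Hebbian rule, and classified by recalling the stored pattern with the highest overlap.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 256  # assumed feature dimensionality (hypothetical, for illustration)

def binarize(features):
    """Map real-valued CNN features to a +/-1 Hopfield pattern."""
    return np.where(features >= np.median(features), 1, -1)

class HopfieldMemory:
    """Associative memory bank storing one binary pattern per class."""
    def __init__(self, dim):
        self.W = np.zeros((dim, dim))
        self.patterns = []
        self.labels = []

    def store(self, pattern, label):
        # Hebbian outer-product learning; no backpropagation involved.
        self.W += np.outer(pattern, pattern)
        np.fill_diagonal(self.W, 0)
        self.patterns.append(pattern)
        self.labels.append(label)

    def recall(self, probe, steps=10):
        # Synchronous Hopfield update toward a stored attractor.
        s = probe.copy()
        for _ in range(steps):
            s = np.sign(self.W @ s)
            s[s == 0] = 1
        return s

    def classify(self, features):
        s = self.recall(binarize(features))
        overlaps = [int(p @ s) for p in self.patterns]
        return self.labels[int(np.argmax(overlaps))]

# Stand-in per-class features (one exemplar per class, 1-shot style).
feats = {c: rng.normal(size=DIM) for c in ["cat", "dog", "car"]}
mem = HopfieldMemory(DIM)
for label, f in feats.items():
    mem.store(binarize(f), label)

# Classify a noise-corrupted version of the "dog" features.
noisy = feats["dog"] + 0.3 * rng.normal(size=DIM)
print(mem.classify(noisy))
```

With only a few patterns in a 256-dimensional memory, the network operates far below Hopfield capacity, so recall from a moderately corrupted probe converges to the stored exemplar, which is what lets the framework skip gradient-based fine-tuning entirely.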


Benchmark Results

Dataset | Model | Metric | Claimed | Verified | Status
Caltech-256 5-way (1-shot) | UL-Hopfield (ULH) | Accuracy | 74.7 | - | Unverified
CIFAR100 5-way (1-shot) | UL-Hopfield (ULH) | Accuracy | 89.6 | - | Unverified
