
Fully trainable Gaussian derivative convolutional layer

2022-07-18 · Code Available

Valentin Penaud--Polge, Santiago Velasco-Forero, Jesus Angulo


Abstract

The Gaussian kernel and its derivatives have already been employed in Convolutional Neural Networks in several previous works. Most of these works compute filters by linearly combining one or several bases of fixed or partially trainable Gaussian kernels, with or without their derivatives. In this article, we propose a highly configurable layer based on anisotropic, oriented, and shifted Gaussian derivative kernels, which generalizes notions encountered in previous related works while keeping their main advantages. The results show that the proposed layer is competitive with previous works and can be successfully included in common deep architectures such as VGG16 for image classification and U-Net for image segmentation.
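To make the idea concrete, here is a minimal NumPy sketch of the kind of kernel the abstract describes: an anisotropic Gaussian with two scales, rotated by an orientation angle, translated by a shift, and differentiated along its rotated axes. The function name, parameterization, and normalization below are illustrative assumptions, not the authors' actual implementation; in the paper these parameters (and the coefficients combining a basis of such kernels) would be trainable.

```python
import numpy as np

def gaussian_derivative_kernel(size=7, sigma_u=1.5, sigma_v=0.75,
                               theta=np.pi / 4, shift=(0.0, 0.0),
                               order=(1, 0)):
    """Sketch of an anisotropic, oriented, shifted Gaussian derivative kernel.

    An axis-aligned Gaussian with scales (sigma_u, sigma_v) is rotated by
    `theta` and translated by `shift`; first-order derivatives along the
    rotated axes are obtained by multiplying with the usual -u/sigma^2
    factors. (Hypothetical helper, not the paper's code.)
    """
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    # Apply the shift, then rotate into the (u, v) frame of the Gaussian.
    xs, ys = x - shift[0], y - shift[1]
    u = np.cos(theta) * xs + np.sin(theta) * ys
    v = -np.sin(theta) * xs + np.cos(theta) * ys
    g = np.exp(-0.5 * ((u / sigma_u) ** 2 + (v / sigma_v) ** 2))
    # First-order derivative factors along the rotated axes.
    if order[0] == 1:
        g = g * (-u / sigma_u ** 2)
    if order[1] == 1:
        g = g * (-v / sigma_v ** 2)
    # Normalize by the absolute sum (derivative kernels sum to ~0).
    return g / np.abs(g).sum()
```

In a layer of the kind described, a small basis of such kernels would be linearly combined into convolution filters, with the combination weights (and possibly the scales, orientation, and shift) learned by backpropagation.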
