Stochastic Perturbations of Tabular Features for Non-Deterministic Inference with Automunge
2022-02-18
Nicholas J. Teague
Abstract
Injecting Gaussian noise into training features is well known to have regularization properties. This paper considers noise injections into numeric or categoric tabular features as passed to inference, which renders the inference outcome non-deterministic and may have relevance to fairness considerations, adversarial example protection, or other use cases that benefit from non-determinism. We offer the Automunge library for tabular preprocessing as a resource for the practice, which includes options to integrate random sampling or entropy seeding with the support of quantum circuits, representing a new way to channel quantum algorithms into classical learning.
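To make the core idea concrete, the following is a minimal, hypothetical sketch of Gaussian noise injection into numeric tabular features at inference time. It is not the Automunge API; the function name, `sigma` and `ratio` parameters, and the assumption of z-score normalized inputs are illustrative choices only.

```python
import numpy as np

def perturb_numeric(features, sigma=0.03, ratio=0.3, rng=None):
    """Illustrative sketch: inject Gaussian noise into a random subset
    of entries in a z-score normalized numeric feature array.

    sigma: standard deviation of the injected Gaussian noise
    ratio: fraction of entries that receive noise
    rng:   numpy Generator; pass a seeded one for reproducibility
    """
    rng = rng or np.random.default_rng()
    mask = rng.random(features.shape) < ratio      # entries selected for perturbation
    noise = rng.normal(0.0, sigma, features.shape)  # Gaussian noise for every entry
    return features + mask * noise                  # noise applied only where masked

# Two inference passes over the same input see different perturbed copies,
# which is what makes the resulting inference non-deterministic.
x = np.zeros((4, 3))
y1 = perturb_numeric(x, rng=np.random.default_rng(1))
y2 = perturb_numeric(x, rng=np.random.default_rng(2))
```

Because only a `ratio` fraction of entries is perturbed and `sigma` is small relative to a normalized feature scale, each pass sees inputs close to, but not identical to, the originals.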