
Proper-Composite Loss Functions in Arbitrary Dimensions

2019-02-19

Zac Cranko, Robert C. Williamson, Richard Nock


Abstract

The study of a machine learning problem is in many ways difficult to separate from the study of the loss function being used. One avenue of inquiry has been to look at these loss functions in terms of their properties as scoring rules via the proper-composite representation, in which predictions are mapped to probability distributions which are then scored via a scoring rule. However, research so far has primarily been concerned with analysing the (typically) finite-dimensional conditional risk problem on the output space, leaving aside the larger total risk minimisation. We generalise a number of these results to an infinite-dimensional setting, and in doing so we are able to exploit the familial resemblance of density and conditional density estimation to provide a simple characterisation of the canonical link.
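As background, the proper-composite representation referenced in the abstract can be sketched as follows (in standard notation, which may differ from the authors' exact symbols): a loss factors as a proper scoring rule composed with the inverse of an invertible link mapping predictions to probability distributions.

```latex
% A loss \ell is proper-composite if it factors as
%   \ell = L \circ \psi^{-1},
% where \psi maps distributions to the prediction space V
% and L is a proper loss (scoring rule):
\[
  \ell(y, v) \;=\; L\bigl(y,\; \psi^{-1}(v)\bigr),
  \qquad \psi \colon \Delta \to V \ \text{invertible},
\]
% with L proper, i.e. the true distribution minimises expected loss:
\[
  p \;\in\; \operatorname*{arg\,min}_{q \in \Delta}\;
  \mathbb{E}_{Y \sim p}\, L(Y, q)
  \quad \text{for all } p \in \Delta.
\]
```

The "canonical link" the abstract characterises is the particular choice of $\psi$ singled out by the proper loss itself (classically, via the derivative of the conditional Bayes risk); the paper's contribution is extending such results beyond the finite-dimensional conditional setting.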
