SOTAVerified

Value-aware transformers for 1.5d data

2021-09-29

James F Cann, Timothy J Roberts, Amy R Tso, Amy Nelson, Parashkev Nachev


Abstract

Sparse, sequential, highly-multivariate data of the form characteristic of hospital in-patient investigation and treatment poses a considerable challenge for representation learning. Such data is neither faithfully reducible to 1d nor dense enough to constitute a multivariate series. Conventional models compromise the data by requiring one of these forms at the point of input. Building on contemporary sequence-modelling architectures, we design a value-aware transformer, prompting a reconceptualisation of our data as 1.5-dimensional: a token-value form that both respects its sequential nature and augments it with a quantifier. Experiments focused on sequential in-patient laboratory data up to 48 hours after hospital admission show that the value-aware transformer compares favourably with competitive baselines on in-hospital mortality and length-of-stay prediction within the MIMIC-III dataset.
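One way to read the "1.5d" token-value idea is as an embedding that combines a discrete token (e.g. which laboratory test was run) with its continuous measured value, so a single sequence position carries both. The sketch below is a hypothetical minimal illustration in numpy, not the authors' exact parameterisation: the token indexes a learned embedding, and the value scales a per-token projection that is added to it.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions for illustration only.
vocab_size, d_model = 100, 16

# One learned embedding per token (e.g. per lab-test type), plus a
# per-token direction along which the measured value is injected.
token_emb = rng.normal(size=(vocab_size, d_model))
value_proj = rng.normal(size=(vocab_size, d_model))

def embed(tokens, values):
    """Token-value ("1.5d") embedding sketch.

    Each event contributes its token embedding plus its continuous
    value scaled along a token-specific direction in model space.
    """
    tokens = np.asarray(tokens)
    values = np.asarray(values, dtype=float)[:, None]
    return token_emb[tokens] + values * value_proj[tokens]

# Three events: (test id, normalised measured value) pairs.
seq = embed([3, 17, 42], [0.5, -1.2, 2.0])
print(seq.shape)  # (3, 16)
```

The resulting `(sequence_length, d_model)` array is what a standard transformer encoder would consume in place of purely discrete token embeddings; a value of zero reduces an event to its plain token embedding.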
