On the Insensitivity of Bit Density to Read Noise in One-bit Quanta Image Sensors
Stanley H. Chan
Abstract
The one-bit quanta image sensor is a photon-counting device that produces binary measurements, where each bit represents the presence or absence of a photon. In the presence of read noise, the sensor quantizes the analog voltage into a binary bit using a threshold value q. The average number of ones in the bitstream is known as the bit density and is often a sufficient statistic for signal estimation. An intriguing phenomenon is observed when the quanta exposure is at unity and the threshold is q = 0.5: the bit density is completely insensitive to the read noise as long as the noise level does not exceed a certain limit. In other words, the bit density stays constant regardless of the amount of read noise. This paper provides a mathematical explanation of the phenomenon by deriving the conditions under which it occurs. It is found that the insensitivity holds when certain symmetries of the underlying Poisson-Gaussian distribution hold.
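The phenomenon can be checked numerically. The sketch below assumes the standard Poisson-Gaussian model implied by the abstract: the analog voltage is a Poisson photon count (mean equal to the quanta exposure) plus zero-mean Gaussian read noise, and the bit is 1 when the voltage exceeds the threshold q. The function names and the choice of evaluating the bit density as an analytic sum over Poisson outcomes are illustrative, not the paper's derivation.

```python
import math

def normal_cdf(x):
    # Standard normal CDF, expressed via the error function.
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bit_density(theta, q, sigma, kmax=50):
    """Probability that the one-bit measurement equals 1, assuming
    voltage = Poisson(theta) photon count + N(0, sigma^2) read noise,
    thresholded at q. Computed by summing over Poisson outcomes k:
    P(bit = 1) = sum_k P(K = k) * Phi((k - q) / sigma)."""
    p = 0.0
    pmf = math.exp(-theta)  # Poisson pmf at k = 0
    for k in range(kmax + 1):
        # P(k + noise > q) = Phi((k - q) / sigma) for sigma > 0
        p += pmf * normal_cdf((k - q) / sigma)
        pmf *= theta / (k + 1)  # recurrence to the pmf at k + 1
    return p

# At unity exposure (theta = 1) and q = 0.5, the bit density stays
# essentially pinned at 1 - e^{-1} for moderate read noise levels,
# but drifts away once sigma grows large.
for sigma in (0.1, 0.2, 0.3, 0.4, 2.0):
    print(sigma, bit_density(1.0, 0.5, sigma))
```

The cancellation visible in the sum is the symmetry the abstract alludes to: at unity exposure the Poisson probabilities of k = 0 and k = 1 are equal, so the ones gained from noise pushing k = 0 above the threshold exactly offset the ones lost from noise pushing k = 1 below it, while higher counts contribute negligibly until the noise is large.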