Neural Networks Optimally Compress the Sawbridge

Aaron B. Wagner
2021 Data Compression Conference (DCC), to appear

Abstract

Neural-network-based compressors have proven to be remarkably effective
at compressing those sources, such as images, that are nominally
high-dimensional but presumed to be concentrated on a low-dimensional
manifold. We consider a continuous-time random process
that models an extreme version of such a source,
wherein the realizations fall along a one-dimensional "curve"
in function space that has infinite-dimensional linear span. We
precisely characterize the optimal entropy-distortion tradeoff
for this source and show numerically that it is achieved by
neural-network-based compressors trained with stochastic gradient
descent. In contrast, we show both analytically and experimentally
that classical compressors based on the Karhunen-Loève transform
are highly suboptimal at high rates.
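For intuition, here is a minimal Python sketch (an illustration, not code from the paper; it assumes the standard sawbridge definition X_t = t - 1{t >= U} with U uniform on [0, 1], which the abstract references but does not state). Each realization is determined by the single scalar U, so the source is a one-parameter family, yet its empirical covariance has many non-negligible eigenvalues, meaning a Karhunen-Loève representation spreads the signal across many coefficients.

    # Hypothetical illustration, not code from the paper: sample sawbridge
    # realizations and inspect the eigenvalues of their empirical covariance.
    # Assumed definition: X_t = t - 1{t >= U}, U ~ Uniform[0, 1].
    import numpy as np

    rng = np.random.default_rng(0)
    n_grid, n_samples = 256, 10_000
    t = np.linspace(0.0, 1.0, n_grid)

    # Each row is one realization: a single uniform U determines the path.
    U = rng.uniform(size=(n_samples, 1))
    X = t - (t >= U).astype(float)          # shape (n_samples, n_grid)

    # Although the source is parameterized by U alone, the covariance has
    # many significant eigenvalues, so a fixed linear (KLT) transform needs
    # many coefficients to represent it accurately.
    cov = np.cov(X, rowvar=False)
    eigvals = np.linalg.eigvalsh(cov)[::-1]  # sort descending
    print("top 10 eigenvalues:", np.round(eigvals[:10], 4))
    print("fraction of variance in top component:", eigvals[0] / eigvals.sum())

Under the assumed definition, the covariance works out to min(s, t) - st, the Brownian bridge covariance, whose eigenvalues decay only like 1/k^2; this slow decay is consistent with the abstract's claim that transform coding based on the Karhunen-Loève transform becomes highly suboptimal at high rates.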