Digitizing an analog signal involves sampling in time and quantizing in amplitude. This process introduces errors into the signal reconstruction, which depend on the amplitude resolution and on the density of acquired samples. While consistent signal reconstruction methods have shown a promising error decay with increasing sampling rates, these findings are limited to offline settings. As a result, there is a research gap in methods for consistent signal reconstruction from data streams.
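To make the notion of consistency concrete, here is a minimal sketch of uniform amplitude quantization. The step size `delta`, the sample value, and the function names are illustrative assumptions, not taken from the paper; a reconstruction is called consistent when it falls back into the same quantization interval as the original sample.

```python
# Hypothetical illustration of uniform quantization (delta and x are
# made-up values, not from the paper).
def quantize(x, delta):
    """Map a sample x to the index of its quantization interval."""
    return int(x // delta)

def interval(k, delta):
    """Quantization interval [lo, hi) associated with index k."""
    return k * delta, (k + 1) * delta

delta = 0.5
x = 1.37
k = quantize(x, delta)
lo, hi = interval(k, delta)
# A reconstruction x_hat is "consistent" when lo <= x_hat < hi,
# i.e., re-quantizing x_hat reproduces the same index k.
assert lo <= x < hi
```

Only the interval, not the exact sample value, is available to the reconstruction method, which is why higher amplitude resolution (smaller `delta`) and denser sampling both reduce reconstruction error.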

This paper introduces a method that addresses this gap by consistently reconstructing streamed multivariate time series from quantization intervals under a zero-delay response requirement. Previous studies have shown that exploiting temporal dependencies in univariate time series improves zero-delay signal reconstructions. This work extends those findings by demonstrating that spatiotemporal dependencies within multivariate time series can also be exploited to obtain better results.

To achieve this, the spatiotemporal dependencies of the multivariate time series are learned with a recurrent neural network. This learning process reduces the roughness of the signal reconstruction while preserving consistency. Experimental results show that the proposed method attains a favorable error-rate decay with increasing sampling rate compared to a similar reconstruction method that lacks consistency.
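The interplay between prediction and consistency described above can be sketched as follows. This is a simplified, assumed reading of the approach: a linear extrapolator stands in for the paper's recurrent neural network, the quantizer is assumed uniform with step `delta`, and all names are hypothetical. The key mechanism is that each incoming quantization index is answered immediately (zero delay) with a predicted value projected onto its interval, so the output is always consistent, while a good predictor lands inside the interval and yields a smoother trajectory.

```python
# Sketch of zero-delay consistent reconstruction for one channel.
# Assumptions: uniform quantizer of step delta; a linear extrapolator
# replaces the paper's recurrent neural network predictor.
def reconstruct_stream(indices, delta):
    """Emit one sample per incoming quantization index (zero delay),
    clamped into its interval so the output stays consistent."""
    out = []
    for k in indices:
        lo, hi = k * delta, (k + 1) * delta
        if len(out) >= 2:
            pred = 2 * out[-1] - out[-2]  # linear extrapolation of past outputs
        elif out:
            pred = out[-1]                # hold the last value
        else:
            pred = (lo + hi) / 2          # no history: use the midpoint
        # Project the prediction onto [lo, hi): consistency is guaranteed,
        # and an accurate predictor reduces the roughness of the output.
        out.append(min(max(pred, lo), hi - 1e-12))
    return out

samples = reconstruct_stream([0, 1, 2, 2, 1], delta=0.5)
```

Every emitted sample lies in the quantization interval of its index, so re-quantizing the reconstruction reproduces the observed stream; the prediction step only changes where inside each interval the sample is placed.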