Infinite-dimensional optimization and Bayesian nonparametric learning of stochastic differential equations
Authors: Arnab Ganguly, Riten Mitra, Jinpu Zhou; 24(159):1−39, 2023.
Abstract
This paper focuses on two main topics. The first part presents general results for infinite-dimensional optimization problems on Hilbert spaces. These results generalize the classical representer theorem and several related results, thereby covering a broader range of applications. The second part introduces a systematic approach to learning the drift function of a stochastic differential equation by combining the results of the first part with a Bayesian hierarchical framework. The Bayesian approach enables low-cost sparse learning through shrinkage priors and provides proper uncertainty quantification through posterior distributions. The accuracy of the proposed learning scheme is demonstrated through several examples.
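The paper's actual method is the Bayesian hierarchical scheme summarized above. Purely as a minimal illustration of the underlying idea, the sketch below uses plain kernel ridge regression, whose finite-dimensional solution form is exactly what the classical representer theorem guarantees, to recover the drift of a simulated Ornstein–Uhlenbeck process from Euler–Maruyama increments. Every modeling and tuning choice here (the SDE parameters, Gaussian kernel, bandwidth `bw`, and ridge penalty `lam`) is an illustrative assumption, not a value or algorithm taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an Ornstein-Uhlenbeck path dX = -X dt + sigma dW (Euler-Maruyama).
# The true drift b(x) = -x is the function we try to recover; all parameter
# values are illustrative assumptions.
dt, n, sigma = 0.01, 3000, 0.3
X = np.empty(n + 1)
X[0] = 1.0
for i in range(n):
    X[i + 1] = X[i] - X[i] * dt + sigma * np.sqrt(dt) * rng.standard_normal()

# Noisy pointwise drift observations from increments: (X_{i+1}-X_i)/dt ~ b(X_i).
xs, ys = X[:-1], np.diff(X) / dt

# Kernel ridge regression with a Gaussian (RBF) kernel. The representer theorem
# guarantees that the RKHS minimizer has the finite expansion
#   b_hat(x) = sum_j alpha_j k(x, x_j)
# over the observed points, so only the vector alpha needs to be solved for.
bw, lam = 0.25, 50.0  # bandwidth and ridge penalty (tuning assumptions)
K = np.exp(-((xs[:, None] - xs[None, :]) ** 2) / (2 * bw**2))
alpha = np.linalg.solve(K + lam * np.eye(n), ys)

def b_hat(x):
    """Evaluate the fitted drift at the points in x."""
    k = np.exp(-((np.atleast_1d(x)[:, None] - xs[None, :]) ** 2) / (2 * bw**2))
    return k @ alpha

grid = np.linspace(-0.5, 0.5, 5)
print(np.round(b_hat(grid), 2))  # should roughly track the true drift -x
```

The fitted drift is a finite kernel expansion even though the optimization is posed over an infinite-dimensional function space; the paper's Bayesian layer would additionally place shrinkage priors on such coefficients to obtain sparsity and posterior uncertainty.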