Random Feature Neural Networks: Learning Black-Scholes Type PDEs Without the Curse of Dimensionality
By Lukas Gonon; 24(189):1−51, 2023.
Abstract
This study investigates the use of random feature neural networks for learning Kolmogorov partial (integro-)differential equations associated with Black-Scholes and exponential Lévy models. Random feature neural networks are single-hidden-layer feedforward networks in which the hidden weights are randomly generated and only the output weights are trainable. This makes training particularly simple, but a priori reduces expressivity; as this article shows, however, the reduction does not hinder the learning of certain Black-Scholes type PDEs. We establish bounds on the prediction error of random neural networks learning sufficiently non-degenerate Black-Scholes type models, and provide a full error analysis covering the approximation, generalization, and optimization errors of the algorithm. Importantly, the derived bounds show that the method does not suffer from the curse of dimensionality. We also apply these results to basket options and validate the error bounds numerically. These results prove that neural networks can learn solutions to suitable Black-Scholes type PDEs without the curse of dimensionality, and they provide an example of a relevant learning problem for which random feature neural networks are provably efficient.
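To make the architecture described above concrete, the following is a minimal sketch of a random feature network regressor, not the paper's exact construction: the hidden weights and biases are sampled once and frozen, so training the output weights reduces to a single (ridge-regularized) linear least-squares solve. The function name, the Gaussian/uniform sampling choices, and the hyperparameters `n_features`, `scale`, and `ridge` are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def fit_random_feature_network(X, y, n_features=512, scale=1.0, ridge=1e-6, seed=0):
    """Fit a single-hidden-layer network whose hidden weights are random and
    fixed; only the output weights are trained. (Illustrative sketch: the
    sampling distributions and activation here are assumptions, not the
    specific construction analyzed in the paper.)"""
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Randomly generated, untrained hidden-layer weights and biases.
    W = rng.normal(scale=scale, size=(d, n_features))
    b = rng.uniform(-1.0, 1.0, size=n_features)
    # Random features: ReLU activations of the fixed random affine map.
    Phi = np.maximum(X @ W + b, 0.0)
    # Only the output weights are trainable, so "training" is a ridge
    # regression: solve (Phi^T Phi + ridge * I) beta = Phi^T y.
    beta = np.linalg.solve(Phi.T @ Phi + ridge * np.eye(n_features), Phi.T @ y)
    # Return a predictor that reuses the same frozen random features.
    return lambda X_new: np.maximum(X_new @ W + b, 0.0) @ beta

# Hypothetical usage: regress simulated option payoffs on initial asset prices.
X_train = np.random.default_rng(1).uniform(0.5, 1.5, size=(2000, 10))
y_train = np.maximum(X_train.mean(axis=1) - 1.0, 0.0)  # basket-call-style payoff
predict = fit_random_feature_network(X_train, y_train)
```

Because the hidden layer is frozen, the entire training cost is one linear solve in the number of features, which is the simplicity of training that the abstract refers to.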