GFlowNet Foundations
Yoshua Bengio, Salem Lahlou, Tristan Deleu, Edward J. Hu, Mo Tiwari, Emmanuel Bengio; 24(210):1−55, 2023.
Abstract
This paper presents additional theoretical properties of Generative Flow Networks (GFlowNets), a method introduced for sampling diverse candidates in an active learning context. The training objective of GFlowNets allows them to approximately sample in proportion to a given reward function. The paper introduces a new local and efficient training objective called detailed balance, which draws an analogy with MCMC. GFlowNets can estimate joint probability distributions and the corresponding marginal distributions, even when some variables are unspecified, and they are particularly useful for representing distributions over composite objects such as sets and graphs. GFlowNets amortize the work typically done by computationally expensive MCMC methods into a single generative pass. They can also estimate partition functions, free energies, conditional probabilities of supersets given a subset, and marginal distributions over all supersets of a given set. The paper further introduces variations that enable the estimation of entropy and mutual information, and that extend GFlowNets to continuous actions and modular energy functions.
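The detailed balance objective mentioned above constrains, for each edge of the generative trajectory, the flow through a state times the forward transition probability to equal the flow through the next state times the backward transition probability: F(s) P_F(s'|s) = F(s') P_B(s|s'). A minimal sketch of the resulting per-transition squared log-ratio loss, with all flow and probability values assumed given in log space (the function name and toy numbers here are illustrative, not from the paper):

```python
import math

def detailed_balance_loss(log_F_s, log_PF, log_F_sp, log_PB):
    """Squared log-ratio residual of the detailed balance condition
    F(s) * P_F(s'|s) = F(s') * P_B(s|s'), for one transition s -> s'.
    A zero loss means the condition holds exactly on this edge."""
    return (log_F_s + log_PF - log_F_sp - log_PB) ** 2

# Toy check: F(s)=2, P_F(s'|s)=0.5, F(s')=1, P_B(s|s')=1
# satisfies 2 * 0.5 = 1 * 1, so the loss is zero.
loss_ok = detailed_balance_loss(math.log(2.0), math.log(0.5),
                                math.log(1.0), math.log(1.0))

# Mismatched flows (F(s')=4 instead of 1) violate the condition,
# so the loss is strictly positive.
loss_bad = detailed_balance_loss(math.log(2.0), math.log(0.5),
                                 math.log(4.0), math.log(1.0))
```

In practice such a loss would be summed over sampled transitions and minimized with respect to the parameters producing the log-flows and log-probabilities; this sketch only shows the local, per-edge form that makes the objective efficient to evaluate.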