Score-based and diffusion models have become popular for both conditional and unconditional content generation. However, conditional generation typically requires either training a conditional model or relying on classifier guidance, which in turn requires a classifier trained on corrupted data, even when a classifier for uncorrupted data is already available. In this work, we propose a novel method for sampling from unconditional score-based generative models while enforcing arbitrary logical constraints, without any additional training.
Our approach manipulates the learned score so as to sample from an unnormalized distribution that weights the data distribution by a user-defined constraint. In addition, we introduce a flexible and numerically stable neuro-symbolic framework for encoding soft logical constraints. Combining these two components yields a general, though approximate, algorithm for conditional sampling, and we introduce effective heuristics that improve the approximation.
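To make the idea concrete, below is a minimal, illustrative sketch of the two components, not the paper's exact algorithm: soft logical connectives evaluated in log space for numerical stability, and a guided score obtained by adding the gradient of a soft-constraint log-term to the unconditional score. All names (`soft_ge`, `land`, `lor`, `lnot`, `guided_score`, `score_model`, `scale`) and the sigmoid relaxation of atoms are assumptions made for illustration.

```python
import torch

# Soft logical constraints in log space: each formula maps a batch of
# samples x to a log-degree of satisfaction (<= 0, with 0 = fully satisfied).
# Working in log space keeps conjunctions and their gradients stable.

def soft_ge(x, threshold, temperature=0.1):
    # Soft relaxation of the atom "x >= threshold" (illustrative choice).
    return torch.nn.functional.logsigmoid((x - threshold) / temperature)

def land(log_a, log_b):
    # Conjunction: product of satisfaction degrees -> sum of logs.
    return log_a + log_b

def lnot(log_a):
    # Negation: log(1 - p), computed stably from log p (clamped away from 0).
    return torch.log1p(-torch.exp(log_a.clamp(max=-1e-6)))

def lor(log_a, log_b):
    # Disjunction via De Morgan's law: not(not a and not b).
    return lnot(land(lnot(log_a), lnot(log_b)))

def guided_score(score_model, constraint, x, t, scale=1.0):
    # Target the unnormalized density p_t(x) * exp(constraint(x)):
    # its score is the unconditional score plus the gradient of the
    # soft-constraint log-term with respect to x.
    x = x.detach().requires_grad_(True)
    grad_c = torch.autograd.grad(constraint(x).sum(), x)[0]
    return score_model(x, t) + scale * grad_c
```

In a reverse-diffusion sampler, `guided_score` would replace the unconditional score at each step; heuristics such as scheduling `scale` over time or evaluating the constraint on a denoised estimate of `x` are common in related guidance methods and are one plausible reading of the heuristics mentioned above.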
To validate the effectiveness of our approach, we conduct experiments on several data types, including tabular data, images, and time series, and show that our method successfully handles a broad range of constraints, demonstrating its versatility and practical applicability.