Flexible Model Aggregation for Quantile Regression
Rasool Fakoor, Taesup Kim, Jonas Mueller, Alexander J. Smola, Ryan J. Tibshirani; 24(162):1−45, 2023.
Abstract
Quantile regression is a fundamental problem in statistical learning that aims to quantify uncertainty in predictions and to model a diverse population without oversimplification. Various models have been developed for this problem over the years in statistics, machine learning, and related fields. Instead of proposing a new algorithm, this paper takes a meta viewpoint and explores methods for aggregating multiple conditional quantile models to enhance accuracy and robustness. The paper considers weighted ensembles whose weights can vary across individual models, quantile levels, and feature values. All the models discussed in this paper can be fitted using modern deep learning toolkits, making them widely accessible and scalable. To improve the accuracy of predicted quantiles or prediction intervals, the paper introduces tools for maintaining the monotonic ordering of quantiles and for applying conformal calibration methods. These tools can be used without modifying the original library of base models. The paper also reviews basic theory on quantile aggregation and related scoring rules and presents new results, such as the finding that post hoc sorting or post hoc isotonic regression can only improve the weighted interval score. Additionally, the paper provides extensive empirical comparisons on 34 datasets from two benchmark repositories.
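The ingredients mentioned in the abstract (a weighted ensemble of quantile predictions, the pinball loss used to score quantile estimates, and post hoc sorting to restore the monotonic ordering of quantile levels) can be sketched as follows. This is a minimal illustration, not the paper's implementation; all function names and array shapes here are assumptions chosen for clarity.

```python
import numpy as np

def pinball_loss(y, q_pred, tau):
    # Pinball (quantile) loss for targets y against predicted tau-quantiles.
    diff = y - q_pred
    return np.mean(np.maximum(tau * diff, (tau - 1.0) * diff))

def ensemble_quantiles(preds, weights):
    # Weighted aggregation of base-model quantile predictions.
    # preds: array of shape (n_models, n_samples, n_levels)
    # weights: array of shape (n_models,), summing to 1
    return np.tensordot(weights, preds, axes=1)

def sort_quantiles(q):
    # Post hoc sorting: enforce non-decreasing quantiles across levels,
    # fixing any "quantile crossing" in the aggregated predictions.
    return np.sort(q, axis=-1)
```

For example, averaging two base models and then sorting the result guarantees that the 0.1-, 0.5-, and 0.9-quantile predictions appear in the correct order for every sample, regardless of whether the raw ensemble output crossed.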