Optimizing ROC Curves with a Sort-Based Surrogate Loss for Binary Classification and Changepoint Detection

Authors: Jonathan Hillman, Toby Dylan Hocking. Published in the Journal of Machine Learning Research, 24(70):1−24, 2023.

Abstract

Receiver Operating Characteristic (ROC) curves are valuable for evaluating binary classification models, but they are difficult to use for learning because the Area Under the Curve (AUC) is a piecewise constant function of the predicted values. ROC curves can also be applied to other problems involving false positive and true positive rates, such as changepoint detection. In this broader context, we demonstrate that the ROC curve may exhibit loops, points with highly sub-optimal error rates, and AUC values greater than one. This observation leads us to propose a new optimization objective: rather than maximizing AUC, we aim for a monotonic ROC curve with AUC=1 that avoids points with large values of Min(FP,FN), the minimum of false positives and false negatives. To achieve this, we introduce an L1 relaxation of the objective, resulting in a new surrogate loss function called the AUM (Area Under Min(FP, FN)). Unlike previous loss functions, which are sums over all labeled examples or pairs of examples, the AUM requires sorting and summing over the sequence of points on the ROC curve. We show that AUM directional derivatives can be computed efficiently and used in a gradient descent learning algorithm. In an empirical study of supervised binary classification and changepoint detection problems, we show that the new AUM minimization learning algorithm achieves both improved AUC and increased speed relative to previous baselines.
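
The sort-and-sum structure of the AUM described above can be made concrete with a short sketch. The Python function below is an illustrative reimplementation for the binary classification case only, not the authors' reference code; the name `aum_binary`, the {-1, +1} label encoding, and the use of unnormalized FP/FN counts are assumptions made here for exposition.

```python
import numpy as np

def aum_binary(pred, label):
    """Area Under Min(FP, FN) for binary classification (illustrative sketch).

    pred  : real-valued predicted scores, shape (n,).
    label : class labels in {-1, +1}, shape (n,).

    Adding a constant c to every prediction traces out the ROC curve:
    example i is called positive when pred[i] + c >= 0.  FP and FN are
    then step functions of c whose breakpoints are at -pred[i], so the
    AUM is a sum over intervals between consecutive sorted breakpoints
    of (interval width) * min(FP, FN).
    """
    pred = np.asarray(pred, dtype=float)
    label = np.asarray(label)
    thresh = -pred                      # breakpoint where each example's error flips
    order = np.argsort(thresh)
    thresh, lab = thresh[order], label[order]
    fp = 0                              # no false positives before the first breakpoint
    fn = int(np.sum(lab == 1))          # every positive starts as a false negative
    aum = 0.0
    for k in range(len(thresh) - 1):
        if lab[k] == 1:
            fn -= 1                     # positive example k is now correctly classified
        else:
            fp += 1                     # negative example k is now a false positive
        aum += (thresh[k + 1] - thresh[k]) * min(fp, fn)
    return aum

# Example: a perfectly ranked pair admits a threshold with zero errors, so AUM == 0.
print(aum_binary([2.0, -1.0], [1, -1]))   # 0.0
print(aum_binary([-1.0, 2.0], [1, -1]))   # 3.0 (reversed ranking)
```

For changepoint detection the same recipe would apply, except that each labeled problem contributes piecewise constant FP/FN functions with possibly several breakpoints, so the sort runs over all breakpoints rather than one per example.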
