Smart "Predict, then Optimize" release_l7sdfs5qgzgftkbpjxtquel3fa

by Adam N. Elmachtoub, Paul Grigas

Released as an article.

(2020)

Abstract

Many real-world analytics problems involve two significant challenges: prediction and optimization. Due to the typically complex nature of each challenge, the standard paradigm is predict-then-optimize. By and large, machine learning tools are intended to minimize prediction error and do not account for how the predictions will be used in the downstream optimization problem. In contrast, we propose a new and very general framework, called Smart "Predict, then Optimize" (SPO), which directly leverages the optimization problem structure, i.e., its objective and constraints, for designing better prediction models. A key component of our framework is the SPO loss function which measures the decision error induced by a prediction. Training a prediction model with respect to the SPO loss is computationally challenging, and thus we derive, using duality theory, a convex surrogate loss function which we call the SPO+ loss. Most importantly, we prove that the SPO+ loss is statistically consistent with respect to the SPO loss under mild conditions. Our SPO+ loss function can tractably handle any polyhedral, convex, or even mixed-integer optimization problem with a linear objective. Numerical experiments on shortest path and portfolio optimization problems show that the SPO framework can lead to significant improvement under the predict-then-optimize paradigm, in particular when the prediction model being trained is misspecified. We find that linear models trained using SPO+ loss tend to dominate random forest algorithms, even when the ground truth is highly nonlinear.
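To make the SPO+ construction concrete, the following is a minimal Python sketch, under assumed details: a downstream problem of the form min_w c^T w over a polyhedral feasible region, and the SPO+ formula l_SPO+(c_hat, c) = max_w {(c - 2 c_hat)^T w} + 2 c_hat^T w*(c) - z*(c), where w*(c) is an optimal decision under the true cost c and z*(c) = c^T w*(c). The helper names (solve_lp, spo_plus_loss) and the toy simplex feasible region are illustrative assumptions, not code or examples from the authors.

import numpy as np
from scipy.optimize import linprog


def solve_lp(cost, A_eq, b_eq):
    # Oracle for the downstream problem: min_w cost^T w  s.t.  A_eq w = b_eq, w >= 0.
    res = linprog(c=cost, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
    return res.x, res.fun


def spo_plus_loss(c_hat, c_true, A_eq, b_eq):
    # SPO+ surrogate: max_w {(c - 2 c_hat)^T w} + 2 c_hat^T w*(c) - z*(c).
    w_star, z_star = solve_lp(c_true, A_eq, b_eq)
    # max_w (c - 2 c_hat)^T w  =  - min_w (2 c_hat - c)^T w
    _, inner_min = solve_lp(2 * c_hat - c_true, A_eq, b_eq)
    return -inner_min + 2 * c_hat @ w_star - z_star


# Assumed toy setup: choose one of three items; feasible region is the probability simplex.
A_eq, b_eq = np.array([[1.0, 1.0, 1.0]]), np.array([1.0])
c_true = np.array([3.0, 1.0, 2.0])  # item 1 is truly cheapest
c_hat = np.array([1.0, 3.0, 2.0])   # a prediction that would (wrongly) pick item 0
print(spo_plus_loss(c_hat, c_true, A_eq, b_eq))  # positive; equals 0 when c_hat = c_true

A prediction model would then be trained by minimizing this loss over training pairs of features and observed costs; the LP oracle above could be swapped for any solver suited to the application (e.g., a shortest-path routine), since the abstract notes the framework handles polyhedral, convex, and mixed-integer problems with a linear objective.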

Archived Files and Locations

application/pdf  1.1 MB
web.archive.org (webarchive)
arxiv.org (repository)
Type: article
Stage: submitted
Date: 2020-07-09
Version: v4
Language: en
arXiv: 1710.08005v4
Work Entity
Access all versions, variants, and formats of this work (e.g., pre-prints).
Catalog Record
Revision: c36d2355-7ff0-46f1-8ff4-8d52300a1bcc