Derivative-free optimization methods

by Jeffrey Larson, Matt Menickelly, Stefan M. Wild

Released as an article.

2019  

Abstract

In many optimization problems arising from scientific, engineering, and artificial intelligence applications, objective and constraint functions are available only as the output of a black-box or simulation oracle that does not provide derivative information. Such settings necessitate the use of methods for derivative-free, or zeroth-order, optimization. We provide a review and perspectives on developments in these methods, with an emphasis on highlighting recent developments and on a unified treatment of such problems in the nonlinear optimization and machine learning literature. We categorize methods based on assumed properties of the black-box functions, as well as features of the methods. We first overview the primary setting of deterministic methods applied to unconstrained, non-convex optimization problems where the objective function is defined by a deterministic black-box oracle. We then discuss developments in randomized methods, methods that assume some additional structure about the objective (including convexity, separability, and general non-smooth compositions), methods for problems where the output of the black-box oracle is stochastic, and methods for handling different types of constraints.
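To illustrate the setting the abstract describes, below is a minimal sketch of compass (coordinate) search, one of the simplest direct-search methods in this class: the objective is queried only as a black-box oracle and is never differentiated. The quadratic test objective, step sizes, and tolerances here are illustrative choices, not taken from the article.

```python
def compass_search(f, x0, step=1.0, tol=1e-6, max_evals=10000):
    """Poll +/- each coordinate direction; halve the step when no poll improves."""
    x = list(x0)
    fx = f(x)
    evals = 1
    n = len(x)
    while step > tol and evals < max_evals:
        improved = False
        for i in range(n):
            for sign in (+1.0, -1.0):
                y = list(x)
                y[i] += sign * step
                fy = f(y)
                evals += 1
                if fy < fx:        # accept the first improving poll point
                    x, fx = y, fy
                    improved = True
                    break
            if improved:
                break
        if not improved:
            step *= 0.5            # no improving direction found: contract
    return x, fx

# Hypothetical black-box oracle: a shifted quadratic with minimum at (1, -2).
oracle = lambda z: (z[0] - 1.0) ** 2 + (z[1] + 2.0) ** 2
xbest, fbest = compass_search(oracle, [0.0, 0.0])
```

Only function values flow between the oracle and the solver, which is the defining feature of the derivative-free methods the survey treats; the survey covers far more sophisticated variants (model-based, randomized, and stochastic-oracle methods).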

Archived Files and Locations

application/pdf  2.3 MB
arxiv.org (repository)
web.archive.org (webarchive)
Type  article
Stage   submitted
Date   2019-04-25
Version   v1
Language   en
arXiv  1904.11585v1