Newton-Type Methods for Non-Convex Optimization Under Inexact Hessian Information

by Peng Xu, Fred Roosta, Michael W. Mahoney

Released as an article.

2019  

Abstract

We consider variants of trust-region and cubic regularization methods for non-convex optimization, in which the Hessian matrix is approximated. Under mild conditions on the inexact Hessian, and using approximate solutions of the corresponding sub-problems, we provide iteration complexities for achieving ε-approximate second-order optimality, which have been shown to be tight. Our Hessian approximation conditions constitute a major relaxation over the existing ones in the literature. Consequently, we are able to show that such mild conditions allow for the construction of the approximate Hessian through various random sampling methods. In this light, we consider the canonical problem of finite-sum minimization, provide appropriate uniform and non-uniform sub-sampling strategies to construct such Hessian approximations, and obtain optimal iteration complexity for the corresponding sub-sampled trust-region and cubic regularization methods.
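As a rough illustration of the sub-sampling idea the abstract refers to, the sketch below forms a uniformly sub-sampled Hessian for a simple finite-sum least-squares problem and compares it against the exact Hessian. The least-squares choice of the component functions, the function name subsampled_hessian, and the sample size are assumptions made only for this example; this is not the paper's algorithm, just a minimal sketch of constructing an inexact Hessian by uniform sampling.

```python
# Minimal sketch (assumed example, not the paper's implementation):
# uniformly sub-sampled Hessian for F(x) = (1/n) * sum_i f_i(x),
# with f_i(x) = 0.5 * (a_i^T x - b_i)^2, so each component Hessian is a_i a_i^T.

import numpy as np

def subsampled_hessian(A, sample_size, seed=None):
    """Average the component Hessians a_i a_i^T over a uniform random sample."""
    rng = np.random.default_rng(seed)
    n = A.shape[0]
    idx = rng.choice(n, size=sample_size, replace=True)  # uniform sampling with replacement
    A_s = A[idx]
    # (1/|S|) * sum_{i in S} a_i a_i^T
    return A_s.T @ A_s / sample_size

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, d = 10_000, 20
    A = rng.standard_normal((n, d))
    H_full = A.T @ A / n                        # exact Hessian of the least-squares sum
    H_sub = subsampled_hessian(A, 500, seed=1)  # inexact Hessian from 500 sampled components
    err = np.linalg.norm(H_sub - H_full, 2) / np.linalg.norm(H_full, 2)
    print(f"relative spectral error of sub-sampled Hessian: {err:.3f}")
```

Such an inexact Hessian would then replace the exact one inside a trust-region or cubic regularization sub-problem; the paper's contribution is the analysis of how accurate this approximation must be for the resulting methods to retain optimal iteration complexity.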

Archived Files and Locations

application/pdf  457.0 kB
file_cujrmwnqazbrrgi32ccdutywka
arxiv.org (repository)
web.archive.org (webarchive)
Type  article
Stage   submitted
Date   2019-05-14
Version   v4
Language   en
arXiv  1708.07164v4
Work Entity
access all versions, variants, and formats of this work (e.g., pre-prints)
Catalog Record
Revision: b7dd14cd-7a5d-4488-a036-2db28ec93079