Quantum gradient descent and Newton's method for constrained polynomial
optimization
by Patrick Rebentrost, Maria Schuld, Leonard Wossnig, Francesco Petruccione, Seth Lloyd
2018
Abstract
Optimization problems in disciplines such as machine learning are commonly
solved with iterative methods. Gradient descent algorithms find local minima by
moving along the direction of steepest descent while Newton's method takes into
account curvature information and thereby often improves convergence. Here, we
develop quantum versions of these iterative optimization algorithms and apply
them to polynomial optimization with a unit norm constraint. In each step,
multiple copies of the current candidate are used to improve the candidate
using quantum phase estimation, an adapted quantum principal component analysis
scheme, as well as quantum matrix multiplications and inversions. The required
operations scale polylogarithmically in the dimension of the solution vector
and exponentially in the number of iterations. The quantum algorithm can
therefore be beneficial for high-dimensional problems where a small number of
iterations suffices.
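As a point of reference for the quantum routines, the two classical iterative schemes the abstract describes can be sketched in a few lines. This is my own illustration, not the paper's algorithm: the objective, step size, and test matrix are illustrative choices, and the unit norm constraint is enforced by renormalizing after each step.

```python
import numpy as np

def projected_gradient_step(A, x, eta=0.1):
    """One gradient-descent step for f(x) = x^T A x,
    followed by projection back onto the unit sphere."""
    x_new = x - eta * (2.0 * A @ x)
    return x_new / np.linalg.norm(x_new)

def projected_newton_step(A, b, x):
    """One Newton step for f(x) = 0.5 x^T A x - b^T x, then renormalization.
    The Hessian here is simply A; using curvature information lets a
    single step reach the unconstrained minimizer A^{-1} b."""
    grad = A @ x - b
    x_new = x - np.linalg.solve(A, grad)
    return x_new / np.linalg.norm(x_new)

if __name__ == "__main__":
    A = np.diag([3.0, 1.0, 0.5])
    x = np.ones(3) / np.sqrt(3.0)
    for _ in range(50):
        x = projected_gradient_step(A, x)
    # On the unit sphere, gradient descent for x^T A x drifts toward
    # the eigenvector of A with the smallest eigenvalue.
    print(np.round(np.abs(x), 3))
```

For this quadratic objective the projected gradient iteration behaves like power iteration on (I - 2*eta*A), which is why it converges to the smallest eigenvector; the Newton variant converges in one step, illustrating the convergence advantage the abstract attributes to curvature information.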
Archived Files and Locations
application/pdf, 1.6 MB — arxiv.org (repository), web.archive.org (webarchive)
arXiv: 1612.01789v3