Avoiding local minima in variational quantum eigensolvers with the natural gradient optimizer
by
David Wierichs, Christian Gogolin, Michael Kastoryano
2020
Abstract
We compare the BFGS optimizer, ADAM and Natural Gradient Descent (NatGrad) in
the context of Variational Quantum Eigensolvers (VQEs). We systematically
analyze their performance on the QAOA ansatz for the Transverse Field Ising
Model (TFIM) as well as on overparametrized circuits with the ability to break
the symmetry of the Hamiltonian. The BFGS algorithm is frequently unable to
find a global minimum for systems beyond about 20 spins and ADAM easily gets
trapped in local minima. On the other hand, NatGrad shows stable performance on
all considered system sizes, albeit at a significantly higher cost per epoch.
In sharp contrast to most classical gradient-based learning, the performance of
all optimizers is found to decrease upon seemingly benign overparametrization
of the ansatz class, with BFGS and ADAM failing more often and more severely
than NatGrad. Additional tests for the Heisenberg XXZ model corroborate the
accuracy problems of BFGS in high dimensions, but they reveal some shortcomings
of NatGrad as well. Our results suggest that great care needs to be taken in
the choice of gradient-based optimizers and the parametrization for VQEs.
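The preconditioning that distinguishes NatGrad from vanilla gradient descent can be illustrated with a minimal single-qubit toy VQE (this is not the paper's code; the RY ansatz, Z Hamiltonian, starting point, and learning rate are illustrative assumptions):

```python
import numpy as np

# Toy VQE sketch: ansatz |psi(theta)> = RY(theta)|0>, Hamiltonian H = Z,
# so the energy landscape is E(theta) = cos(theta), minimized at theta = pi.

def energy(theta):
    return np.cos(theta)

def grad(theta):
    return -np.sin(theta)

# For this one-parameter RY ansatz the Fubini-Study metric is constant:
# g = <d_psi|d_psi> - |<psi|d_psi>|^2 = 1/4.
FS_METRIC = 0.25

def natgrad_descent(theta0, lr=0.1, steps=200):
    """Natural gradient descent: precondition the gradient by g^{-1}."""
    theta = theta0
    for _ in range(steps):
        theta -= lr * grad(theta) / FS_METRIC
    return theta

theta_opt = natgrad_descent(theta0=0.3)
print(energy(theta_opt))  # close to the ground-state energy -1
```

In this one-dimensional case the metric only rescales the step size, but for multi-parameter circuits the full metric tensor reweights directions in parameter space, which is what helps NatGrad avoid the local minima that trap ADAM and BFGS.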
Archived Files and Locations
application/pdf, 1.4 MB — arxiv.org (repository), web.archive.org (webarchive)
arXiv: 2004.14666v1