Unit Tests for Stochastic Optimization
by
Tom Schaul, Ioannis Antonoglou, David Silver
2014
Abstract
Optimization by stochastic gradient descent is an important component of many
large-scale machine learning algorithms. A wide variety of such optimization
algorithms have been devised; however, it is unclear whether these algorithms
are robust and widely applicable across many different optimization landscapes.
In this paper we develop a collection of unit tests for stochastic
optimization. Each unit test rapidly evaluates an optimization algorithm on a
small-scale, isolated, and well-understood difficulty, rather than in
real-world scenarios where many such issues are entangled. Passing these unit
tests is not sufficient, but it is necessary for any algorithm that claims
generality or robustness. We give initial quantitative and
qualitative results on numerous established algorithms. The testing framework
is open-source, extensible, and easy to apply to new algorithms.
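To make the abstract's idea concrete, the following is a minimal sketch of what one such unit test might look like: a small, well-understood stochastic problem (a noisy one-dimensional quadratic) on which an optimizer either reaches a target accuracy or fails. All names, the problem, and the pass threshold here are illustrative assumptions, not the paper's actual test suite.

```python
import random

def noisy_quadratic_grad(x, noise=0.5):
    # Gradient of f(x) = 0.5 * x^2, corrupted by Gaussian noise,
    # mimicking a stochastic (minibatch) gradient estimate.
    return x + random.gauss(0.0, noise)

def sgd(grad_fn, x0, lr=0.1, steps=200):
    # Plain stochastic gradient descent, the baseline optimizer.
    x = x0
    for _ in range(steps):
        x -= lr * grad_fn(x)
    return x

def unit_test_noisy_quadratic(optimizer, threshold=0.5, seed=0):
    # The test passes iff the optimizer drives |x| below the
    # threshold on this isolated, well-understood difficulty.
    random.seed(seed)
    x_final = optimizer(noisy_quadratic_grad, x0=5.0)
    return abs(x_final) < threshold

print(unit_test_noisy_quadratic(sgd))
```

A full suite in this spirit would run many such small tests, each isolating one difficulty (noise, curvature, non-stationarity), and report a pass/fail profile per algorithm.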
Archived Files and Locations
application/pdf (8.7 MB), available from arxiv.org (repository) and web.archive.org (webarchive)
arXiv: 1312.6055v3