Unit Tests for Stochastic Optimization

by Tom Schaul, Ioannis Antonoglou, David Silver

Released as an article.

2014  

Abstract

Optimization by stochastic gradient descent is an important component of many large-scale machine learning algorithms. A wide variety of such optimization algorithms have been devised; however, it is unclear whether these algorithms are robust and widely applicable across many different optimization landscapes. In this paper we develop a collection of unit tests for stochastic optimization. Each unit test rapidly evaluates an optimization algorithm on a small-scale, isolated, and well-understood difficulty, rather than in real-world scenarios where many such issues are entangled. Passing these unit tests is not sufficient, but absolutely necessary for any algorithms with claims to generality or robustness. We give initial quantitative and qualitative results on numerous established algorithms. The testing framework is open-source, extensible, and easy to apply to new algorithms.
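As a rough illustration of the idea described in the abstract (not the authors' released framework), a unit test in this spirit runs an optimizer on a tiny, well-understood problem and checks that it makes clear progress toward the known optimum. The sketch below uses a noisy one-dimensional quadratic; the problem, optimizer settings, and pass tolerance are assumptions made for this example.

```python
# Illustrative sketch only: a "unit test" in the spirit of the paper,
# not the authors' actual open-source framework. The prototype problem,
# learning rate, and tolerance are assumptions for this example.
import random


def noisy_quadratic_grad(x, noise_scale=0.1):
    """Stochastic gradient of f(x) = 0.5 * x^2 with additive Gaussian noise."""
    return x + random.gauss(0.0, noise_scale)


def sgd(grad_fn, x0, lr=0.1, steps=500):
    """Plain stochastic gradient descent."""
    x = x0
    for _ in range(steps):
        x -= lr * grad_fn(x)
    return x


def test_sgd_on_noisy_quadratic():
    """Pass if the optimizer gets substantially closer to the minimum at x = 0."""
    random.seed(0)
    x0 = 5.0
    x_final = sgd(noisy_quadratic_grad, x0)
    assert abs(x_final) < 0.5 * abs(x0), f"insufficient progress: {x_final}"


if __name__ == "__main__":
    test_sgd_on_noisy_quadratic()
    print("unit test passed")
```

A full suite would combine many such small, isolated difficulties (e.g., noise, curvature, non-stationarity) so that a failure points to a specific weakness of the algorithm under test.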

Archived Files and Locations

application/pdf  8.7 MB
arxiv.org (repository)
web.archive.org (webarchive)
Type  article
Stage   submitted
Date   2014-02-25
Version   v3
Language   en
arXiv  1312.6055v3
Work Entity
Access all versions, variants, and formats of this work (e.g., pre-prints)
Catalog Record
Revision: 4f6f9e2a-11ad-483e-90df-e44934415438