Parameter-free online learning via model selection

by Dylan J. Foster, Satyen Kale, Mehryar Mohri, Karthik Sridharan

Released as an article.

2018  

Abstract

We introduce an efficient algorithmic framework for model selection in online learning, also known as parameter-free online learning. Departing from previous work, which has focused on highly structured function classes such as nested balls in Hilbert space, we propose a generic meta-algorithm framework that achieves online model selection oracle inequalities under minimal structural assumptions. We give the first computationally efficient parameter-free algorithms that work in arbitrary Banach spaces under mild smoothness assumptions; previous results applied only to Hilbert spaces. We further derive new oracle inequalities for matrix classes, non-nested convex sets, and R^d with generic regularizers. Finally, we generalize these results by providing oracle inequalities for arbitrary non-linear classes in the online supervised learning model. These results are all derived through a unified meta-algorithm scheme using a novel "multi-scale" algorithm for prediction with expert advice based on random playout, which may be of independent interest.
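The abstract builds on the classical prediction-with-expert-advice setting. As background only, here is a minimal sketch of the standard exponential-weights (Hedge) forecaster — not the paper's "multi-scale" algorithm, which adapts its scale per expert — assuming per-round losses in [0, 1] and a fixed learning rate `eta` (both choices are illustrative assumptions):

```python
import math

def hedge(loss_rounds, n_experts, eta):
    """Classical exponential-weights (Hedge) forecaster (background sketch).

    loss_rounds: iterable of per-round loss vectors, one loss in [0, 1]
    per expert. Returns the learner's cumulative expected loss when it
    plays the normalized weight vector each round.
    """
    weights = [1.0] * n_experts
    total_loss = 0.0
    for losses in loss_rounds:
        z = sum(weights)
        probs = [w / z for w in weights]
        # Expected loss of the randomized prediction this round.
        total_loss += sum(p * l for p, l in zip(probs, losses))
        # Multiplicative update: downweight experts that incurred high loss.
        weights = [w * math.exp(-eta * l) for w, l in zip(weights, losses)]
    return total_loss
```

With `eta` of order `sqrt(ln(N) / T)` this forecaster's regret against the best single expert grows as `O(sqrt(T ln N))`; the paper's contribution is, roughly, to replace this single fixed scale with expert-specific scales so that the regret adapts to the loss range of the comparator.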

Archived Files and Locations

application/pdf  390.7 kB
file_565u47o67jhipave6jwvfqdkc4
arxiv.org (repository)
web.archive.org (webarchive)
Type: article
Stage: submitted
Date: 2018-01-03
Version: v2
Language: en
arXiv: 1801.00101v2
Work Entity
Access all versions, variants, and formats of this work (e.g., pre-prints).
Catalog Record
Revision: 4abf2d02-891d-40a6-8366-5cf267071aae