A scaled gradient projection method for Bayesian learning in dynamical
systems
by
Silvia Bonettini, Alessandro Chiuso, and Marco Prato
2014
Abstract
A crucial task in system identification problems is the selection of the most
appropriate model class, which is classically addressed by resorting to
cross-validation or to asymptotic arguments. As recently suggested in the
literature, this task can be tackled in a Bayesian framework, where model
complexity is regulated by a few hyperparameters that can be estimated via
marginal likelihood maximization. It is thus of primary importance to design
effective methods to solve the corresponding optimization problem. If the
unknown impulse response is modeled as a Gaussian process with a suitable
kernel, the maximization of the marginal likelihood leads to a challenging
nonconvex optimization problem, which requires a stable and effective solution
strategy. In this paper we address this problem by means of a scaled gradient
projection algorithm, in which the scaling matrix and the steplength parameter
play a crucial role in providing a meaningful solution in a computational time
comparable with that of second-order methods. In particular, we propose both a
generalization of the split gradient approach to design the scaling matrix in
the presence of box constraints, and an effective implementation of the
objective function and gradient computations. Extensive numerical experiments
carried out on several test problems show that our method provides, in a few
tenths of a second, solutions with an accuracy comparable with that of
state-of-the-art approaches. Moreover, the flexibility of the proposed strategy
makes it easily adaptable to a wider range of problems arising in different
areas of machine learning, signal processing, and system identification.
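
To make the optimization step concrete, the sketch below shows, in Python, a generic scaled gradient projection iteration for a box-constrained problem: a diagonally scaled gradient step, an elementwise projection onto the box, and a monotone Armijo backtracking line search along the resulting feasible direction. This is only an illustration under stated assumptions, not the paper's implementation: the identity scaling used here is a placeholder for the split-gradient scaling matrix proposed in the paper, no steplength update rule (e.g. Barzilai-Borwein) is included, and the handles f and grad are hypothetical stand-ins for the negative marginal likelihood and its gradient.

    import numpy as np

    def sgp_box(f, grad, x0, lb, ub, alpha=1.0, max_iter=200, tol=1e-6,
                beta=1e-4, eta=0.5):
        # Scaled gradient projection sketch for min f(x) s.t. lb <= x <= ub.
        # f, grad: callables returning the objective value and its gradient.
        x = np.clip(np.asarray(x0, dtype=float), lb, ub)
        fx = f(x)
        for _ in range(max_iter):
            g = grad(x)
            # Diagonal scaling D_k: identity as a placeholder assumption;
            # a split-gradient construction of D_k would go here.
            scale = np.ones_like(x)
            # With a diagonal metric, projection onto the box is an
            # elementwise clip.
            y = np.clip(x - alpha * scale * g, lb, ub)
            d = y - x                       # feasible descent direction
            if np.linalg.norm(d) < tol:
                break
            lam, slope = 1.0, float(g @ d)  # Armijo backtracking along d
            while f(x + lam * d) > fx + beta * lam * slope and lam > 1e-12:
                lam *= eta
            x = x + lam * d
            fx = f(x)
        return x

    # Toy usage on a quadratic objective over the box [0, 5]^2.
    f = lambda x: 0.5 * np.sum((x - np.array([3.0, -1.0])) ** 2)
    grad = lambda x: x - np.array([3.0, -1.0])
    x_star = sgp_box(f, grad, x0=np.array([1.0, 1.0]), lb=0.0, ub=5.0)

Note that with a diagonal scaling matrix the projection onto the box in the scaled metric reduces to an elementwise clip, which keeps each iteration inexpensive.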
arXiv: 1406.6603v1 (application/pdf, 481.6 kB)