Parameter Estimation with the Ordered ℓ_2 Regularization via an Alternating Direction Method of Multipliers

by Mahammad Humayoo, Xueqi Cheng

Released as an article.

2019  

Abstract

Regularization is a popular technique in machine learning for model estimation and for avoiding overfitting. Prior studies have found that modern ordered regularization can be more effective at handling highly correlated, high-dimensional data than traditional regularization, because ordered regularization can reject irrelevant variables and yield an accurate estimation of the parameters. How to scale up ordered regularization problems to large-scale training data remains an open question. This paper explores parameter estimation with the ordered ℓ_2 regularization via the Alternating Direction Method of Multipliers (ADMM), called ADMM-Oℓ_2. The advantages of ADMM-Oℓ_2 include (i) scaling the ordered ℓ_2 up to large-scale datasets, (ii) predicting parameters correctly by excluding irrelevant variables automatically, and (iii) a fast convergence rate. Experimental results on both synthetic and real data indicate that ADMM-Oℓ_2 performs better than, or comparably to, several state-of-the-art baselines.
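The abstract's ADMM-Oℓ_2 updates are not reproduced on this record page. As a rough illustration of the ADMM splitting the abstract refers to, the sketch below solves a regularized least-squares problem with the standard ADMM alternation; it substitutes a plain ridge (ℓ_2) penalty for the ordered ℓ_2 penalty, whose proximal operator the paper itself derives. The function name, parameters, and the z-update are therefore illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def admm_ridge_sketch(A, b, lam=1.0, rho=1.0, n_iter=500):
    """ADMM sketch for min_x 0.5*||Ax - b||^2 + g(z) s.t. x = z.

    Here g(z) = 0.5*lam*||z||^2 (ridge) stands in for the ordered
    l2 penalty of ADMM-Ol2; the paper's method would replace the
    z-update with the proximal operator of the ordered l2 norm.
    """
    n = A.shape[1]
    AtA, Atb = A.T @ A, A.T @ b
    x = np.zeros(n)
    z = np.zeros(n)
    u = np.zeros(n)  # scaled dual variable
    M = AtA + rho * np.eye(n)  # x-update system matrix (fixed)
    for _ in range(n_iter):
        # x-update: solve (A^T A + rho I) x = A^T b + rho (z - u)
        x = np.linalg.solve(M, Atb + rho * (z - u))
        # z-update: prox of the penalty at x + u; for ridge this
        # is a simple shrinkage by rho / (lam + rho)
        z = rho * (x + u) / (lam + rho)
        # dual ascent on the consensus constraint x = z
        u = u + x - z
    return x
```

With the ridge stand-in, the iterates converge to the closed-form ridge solution (A^T A + λI)^{-1} A^T b, which makes the sketch easy to sanity-check.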

Archived Files and Locations

application/pdf  301.6 kB
file_tbq6didkvfcsrftoiubh5yikee
arxiv.org (repository)
web.archive.org (webarchive)
Type  article
Stage   submitted
Date   2019-09-04
Version   v1
Language   en
arXiv  1909.01519v1
Catalog Record
Revision: c69b6bc9-38f2-4e6f-b982-de1f44350123