Power Function Error Initialization Can Improve Convergence of Backpropagation Learning in Neural Networks for Classification

by Andreas Knoblauch

Published in Neural Computation by MIT Press - Journals.

2021   Volume 33, Issue 8, p1-33

Abstract

Supervised learning corresponds to minimizing a loss or cost function expressing the differences between model predictions y_n and the target values t_n given by the training data. In neural networks, this means backpropagating error signals through the transposed weight matrices from the output layer toward the input layer. For this, error signals in the output layer are typically initialized by the difference y_n - t_n, which is optimal for several commonly used loss functions like cross-entropy or sum of squared errors. Here I evaluate a more general error initialization method using power functions |y_n - t_n|^q for q > 0, corresponding to a new family of loss functions that generalize cross-entropy. Surprisingly, experiments on various learning tasks reveal that a proper choice of q can significantly improve the speed and convergence of backpropagation learning, in particular in deep and recurrent neural networks. The results suggest two main reasons for the observed improvements. First, compared to cross-entropy, the new loss functions provide better fits to the distribution of error signals in the output layer and therefore maximize the model's likelihood more efficiently. Second, the new error initialization procedure may often provide a better gradient-to-loss ratio over a broad range of neural output activity, thereby avoiding flat loss landscapes with vanishing gradients.
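As a rough illustration of the idea summarized above, the following is a minimal sketch (not the author's published code) of the power-function error initialization for the output layer, assuming a NumPy setting; the function name output_error and the toy vectors y and t are hypothetical.

import numpy as np

def output_error(y, t, q=1.0):
    # Output-layer error signal: sign(y - t) * |y - t|^q.
    # q = 1 recovers the standard initialization y - t, which is optimal
    # for cross-entropy and sum-of-squared-errors losses; other q > 0
    # values correspond to the generalized loss family described in the paper.
    diff = y - t
    return np.sign(diff) * np.abs(diff) ** q

# Hypothetical usage: the resulting delta would be backpropagated through the
# transposed weight matrices exactly as in ordinary backpropagation.
y = np.array([0.7, 0.2, 0.1])   # model predictions (e.g., softmax outputs)
t = np.array([1.0, 0.0, 0.0])   # one-hot target
delta = output_error(y, t, q=0.5)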

Archived Files and Locations

application/pdf  1.1 MB
file_3s4wzycderg4dovxcabbsjapli
watermark.silverchair.com (publisher)
web.archive.org (webarchive)
Type  article-journal
Stage   published
Date   2021-05-26
Language   en
Container Metadata
Not in DOAJ
In Keepers Registry
ISSN-L:  0899-7667
Work Entity
access all versions, variants, and formats of this work (e.g., pre-prints)
Catalog Record
Revision: d4dcc3cf-ba36-4fae-a0f1-e85b82d887d7