Bridging the Gap between Sample-based and One-shot Neural Architecture Search with BONAS
by
Han Shi, Renjie Pi, Hang Xu, Zhenguo Li, James T. Kwok, Tong Zhang
2020
Abstract
Neural Architecture Search (NAS) has shown great potential in finding better
neural network designs. Sample-based NAS is the most reliable approach, as it
explicitly explores the search space and evaluates the most promising
architectures. However, it is computationally very costly. As a remedy, the
one-shot approach has emerged as a popular technique for accelerating NAS using
weight-sharing. However, due to the weight-sharing of vastly different
networks, the one-shot approach is less reliable than the sample-based
approach. In this work, we propose BONAS (Bayesian Optimized Neural
Architecture Search), a sample-based NAS framework which is accelerated using
weight-sharing to evaluate multiple related architectures simultaneously.
Specifically, we apply a Graph Convolutional Network (GCN) predictor as a
surrogate model for Bayesian Optimization to select multiple related candidate models in
each iteration. We then apply weight-sharing to train multiple candidate models
simultaneously. This approach not only accelerates the traditional sample-based
approach significantly, but also retains its reliability. This is because
weight-sharing among related architectures is more reliable than weight-sharing
across the vastly different networks of the one-shot approach. Extensive
experiments are conducted to verify the
effectiveness of our method over many competing algorithms.
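The abstract outlines a search loop in which a surrogate predictor guides Bayesian Optimization to pick a batch of related candidates each round, after which the batch is trained jointly with weight-sharing and the observed accuracies update the surrogate. Below is a minimal, self-contained Python sketch of that loop under strong simplifications; it is not the authors' implementation. The GCN predictor is replaced by a toy similarity-weighted surrogate, joint weight-sharing training is replaced by a mock evaluation, and all identifiers (SurrogatePredictor, mock_true_accuracy, ucb, search) are illustrative names introduced here, not from the paper.

    # Minimal sketch of a BONAS-style search loop (hypothetical names, not the
    # authors' code). A surrogate scores unseen architectures, an acquisition
    # function picks a batch of related candidates, and the batch is evaluated
    # together (standing in for weight-sharing training) before the surrogate
    # is refit on the observed accuracies.
    import math
    import random

    def mock_true_accuracy(arch):
        # Stand-in for training/evaluating an architecture; this toy objective
        # rewards deeper architectures with a skip connection.
        depth, has_skip = arch
        return 0.7 + 0.02 * depth + (0.05 if has_skip else 0.0) + random.gauss(0, 0.01)

    class SurrogatePredictor:
        """Toy surrogate standing in for the GCN predictor: predicts a
        similarity-weighted mean of observed accuracies plus an uncertainty."""
        def __init__(self):
            self.history = []          # (architecture, accuracy) pairs seen so far

        def fit(self, data):
            self.history.extend(data)

        def predict(self, arch):
            if not self.history:
                return 0.5, 1.0        # prior mean, high uncertainty
            weights, values = [], []
            for seen_arch, acc in self.history:
                sim = math.exp(-abs(seen_arch[0] - arch[0]) - abs(seen_arch[1] - arch[1]))
                weights.append(sim)
                values.append(acc)
            mean = sum(w * v for w, v in zip(weights, values)) / sum(weights)
            uncertainty = 1.0 / (1.0 + sum(weights))
            return mean, uncertainty

    def ucb(mean, uncertainty, beta=2.0):
        # Upper-confidence-bound acquisition: exploit high predictions while
        # exploring architectures the surrogate is unsure about.
        return mean + beta * uncertainty

    def search(num_iterations=5, batch_size=4):
        # Toy search space: (depth, has_skip) pairs.
        search_space = [(depth, skip) for depth in range(1, 11) for skip in (0, 1)]
        surrogate = SurrogatePredictor()
        evaluated = {}
        for _ in range(num_iterations):
            # Score unevaluated architectures and select the top batch; in
            # BONAS this batch would then be trained jointly via weight-sharing.
            candidates = [a for a in search_space if a not in evaluated]
            scored = sorted(candidates,
                            key=lambda a: ucb(*surrogate.predict(a)),
                            reverse=True)
            batch = scored[:batch_size]
            results = [(a, mock_true_accuracy(a)) for a in batch]
            evaluated.update(results)
            surrogate.fit(results)
        return max(evaluated.items(), key=lambda kv: kv[1])

    if __name__ == "__main__":
        best_arch, best_acc = search()
        print("best architecture:", best_arch, "accuracy: %.3f" % best_acc)

Selecting a whole batch per iteration mirrors the abstract's point that multiple related candidate models are chosen and trained simultaneously; in the real method the amortized cost of weight-sharing over that batch is what accelerates sample-based search.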
arXiv:1911.09336v4