NASIB: Neural Architecture Search withIn Budget

by Abhishek Singh, Anubhav Garg, Jinan Zhou, Shiv Ram Dubey, Debo Dutta

Released as an article.

2019  

Abstract

Neural Architecture Search (NAS) represents a class of methods that generate the optimal neural network architecture, typically by iterating over candidate architectures until convergence on some particular metric such as validation loss. These methods are constrained by the available computation resources, especially in enterprise environments. In this paper, we propose a new approach for NAS, called NASIB, which adapts and attunes to the available computation resources (budget) by varying the exploration vs. exploitation trade-off. We reduce expert bias by searching over an augmented search space induced by Superkernels. The proposed method can make architecture search useful for different computation resources and for domains beyond image classification of natural images, where bespoke architecture motifs and domain expertise are lacking. We show, on CIFAR10, that it is possible to search over a space comprising 12x more candidate operations than the traditional prior art in just 1.5 GPU days, while reaching close to state-of-the-art accuracy. While our method searches over an exponentially larger search space, it could lead to novel architectures that require less domain expertise, compared to the majority of existing methods.
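
As a rough illustration of the superkernel idea and the budget-driven exploration vs. exploitation trade-off described in the abstract, the sketch below mixes a small pool of candidate operations with softmax-weighted architecture parameters and anneals a temperature as the compute budget is spent. The class name, candidate pool, and schedule are hypothetical; they do not reproduce the paper's actual formulation or its full operation set.

import torch
import torch.nn as nn
import torch.nn.functional as F

class SuperKernel(nn.Module):
    """Soft mixture over candidate operations on one edge of the search space.

    Hypothetical sketch of a DARTS-like mixed operation; NASIB's actual
    superkernel and candidate pool are defined in the paper.
    """
    def __init__(self, channels, candidate_ops):
        super().__init__()
        self.ops = nn.ModuleList(make_op(channels) for make_op in candidate_ops)
        # One architecture logit per candidate operation.
        self.alpha = nn.Parameter(torch.zeros(len(self.ops)))

    def forward(self, x, temperature=1.0):
        # Higher temperature -> flatter mixture (exploration);
        # lower temperature -> sharper mixture (exploitation).
        weights = F.softmax(self.alpha / temperature, dim=0)
        return sum(w * op(x) for w, op in zip(weights, self.ops))

# A tiny, shape-preserving candidate pool (illustration only).
candidate_ops = [
    lambda c: nn.Conv2d(c, c, kernel_size=3, padding=1),
    lambda c: nn.Conv2d(c, c, kernel_size=5, padding=2),
    lambda c: nn.MaxPool2d(kernel_size=3, stride=1, padding=1),
    lambda c: nn.Identity(),
]

def temperature_schedule(step, total_budget_steps, t_max=5.0, t_min=0.1):
    # Anneal from exploration toward exploitation as the budget is consumed.
    frac = min(step / max(total_budget_steps, 1), 1.0)
    return t_max + frac * (t_min - t_max)

# Usage: one forward pass at a budget-dependent temperature.
sk = SuperKernel(channels=16, candidate_ops=candidate_ops)
x = torch.randn(2, 16, 32, 32)
t = temperature_schedule(step=100, total_budget_steps=1000)
y = sk(x, temperature=t)  # shape (2, 16, 32, 32)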

Archived Files and Locations

application/pdf  476.4 kB
file_j7v7khvetjh5jpuno6teos2wzy
arxiv.org (repository)
web.archive.org (webarchive)
Type  article
Stage   submitted
Date   2019-10-19
Version   v1
Language   en
arXiv  1910.08665v1
Work Entity
access all versions, variants, and formats of this work (e.g., pre-prints)
Catalog Record
Revision: 15e68596-c9cc-4675-be79-917e52b4ecb2
API URL: JSON