Transfer Learning with Neural AutoML

Catherine Wong
Neil Houlsby
Yifeng Lu
NIPS (2018)

Abstract

We reduce the computational cost of Neural AutoML with transfer learning. AutoML
relieves human effort by automating the design of ML algorithms. Neural
AutoML has become popular for the design of deep learning architectures; however,
this method has a high computational cost. To address this, we propose Transfer
Neural AutoML that uses knowledge from prior tasks to speed up network design.
We extend RL-based architecture search methods to support parallel training on
multiple tasks and then transfer the search strategy to new tasks. On language and
image classification data, Transfer Neural AutoML reduces convergence time over
single-task training by over an order of magnitude on many tasks.
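To make the transfer idea concrete, the following is a minimal, illustrative sketch, not the authors' implementation: a REINFORCE controller learns a distribution over discrete architecture choices, its parameters are pretrained on prior tasks, and a new task's search is warm-started from those parameters rather than from scratch. The reward function proxy_reward, the helper make_task, the slot/choice sizes, and all hyperparameters are hypothetical stand-ins; the paper's actual controller is an RNN trained in parallel across tasks.

```python
# Hypothetical sketch of RL-based architecture search with transfer.
# The "controller" is a table of logits, one row per decision slot;
# the paper uses an RNN controller, simplified away here.

import numpy as np

rng = np.random.default_rng(0)

N_SLOTS, N_CHOICES = 4, 5  # e.g. 4 architecture decisions, 5 options each

def softmax(x):
    z = np.exp(x - x.max())
    return z / z.sum()

def sample_architecture(logits):
    """Sample one option per decision slot from the controller policy."""
    return [rng.choice(N_CHOICES, p=softmax(row)) for row in logits]

def reinforce_step(logits, arch, reward, baseline, lr=0.1):
    """Policy-gradient update: raise the log-probability of the sampled
    choices in proportion to the advantage (reward - baseline)."""
    adv = reward - baseline
    for slot, choice in enumerate(arch):
        grad = -softmax(logits[slot])  # d log-softmax / d logits ...
        grad[choice] += 1.0            # ... is one-hot minus softmax
        logits[slot] += lr * adv * grad
    return logits

def search(logits, proxy_reward, steps=200):
    """Run the architecture search loop with a moving-average baseline."""
    baseline = 0.0
    for _ in range(steps):
        arch = sample_architecture(logits)
        r = proxy_reward(arch)  # stand-in for child-model validation accuracy
        baseline = 0.9 * baseline + 0.1 * r
        logits = reinforce_step(logits, arch, r, baseline)
    return logits

def make_task(shift):
    """Hypothetical task: rewards architectures near a shifted target."""
    target = (np.arange(N_SLOTS) + shift) % N_CHOICES
    return lambda arch: float(np.mean(np.array(arch) == target))

# Pretraining on prior tasks (done sequentially here for simplicity;
# the paper trains one controller on multiple tasks in parallel).
logits = np.zeros((N_SLOTS, N_CHOICES))
for shift in (0, 1):
    logits = search(logits, make_task(shift))

# Transfer: warm-start the new task's search from the pretrained logits.
# Starting near a good policy is what cuts convergence time.
logits = search(logits.copy(), make_task(1), steps=50)
```

In this toy setting, the warm-started search needs far fewer steps than a controller initialized from zeros, mirroring the convergence speedup the abstract reports at toy scale.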
