AlphaX: eXploring Neural Architectures with Deep Neural Networks and
Monte Carlo Tree Search
- BDL
We present AlphaX, a fully automated agent that designs complex neural architectures from scratch. AlphaX explores the search space with a distributed Monte Carlo Tree Search (MCTS) and a Meta-Deep Neural Network (Meta-DNN). MCTS guides transfer learning and intrinsically improves search efficiency by dynamically balancing exploration and exploitation at fine-grained states, while the Meta-DNN predicts network accuracy to guide the search and to provide an estimated reward that speeds up rollouts. As the search progresses, AlphaX also generates the training data for the Meta-DNN, so the Meta-DNN is trained end-to-end. In 8 GPU days, AlphaX found an architecture that reaches 97.88% top-1 accuracy on CIFAR-10 and 75.5% top-1 accuracy on ImageNet. We also evaluate AlphaX on a large-scale NAS dataset for reproducibility. On NAS-Bench-101, AlphaX demonstrates 3x and 2.8x speedups over Random Search and Regularized Evolution, respectively, in finding the global optimum. Finally, we show that the searched architecture improves a variety of vision applications, from Neural Style Transfer to Image Captioning and Object Detection.
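The core loop described above, MCTS selection balancing exploration and exploitation, with a learned predictor supplying estimated rewards for rollouts, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the `Node` encoding, the `predict_accuracy` stand-in for the Meta-DNN, and all hyperparameters are hypothetical.

```python
import math
import random

def uct_score(child_value, child_visits, parent_visits, c=1.41):
    # Upper Confidence bound for Trees: exploitation term plus an
    # exploration bonus that shrinks as a child accumulates visits.
    if child_visits == 0:
        return float("inf")  # unvisited children are tried first
    return child_value / child_visits + c * math.sqrt(
        math.log(parent_visits) / child_visits
    )

class Node:
    def __init__(self, arch):
        self.arch = arch       # hypothetical architecture encoding
        self.children = []
        self.visits = 0
        self.value = 0.0       # sum of backed-up (predicted) accuracies

def select(root):
    # Descend the tree by UCT until a leaf, collecting the path.
    path = [root]
    node = root
    while node.children:
        node = max(node.children,
                   key=lambda ch: uct_score(ch.value, ch.visits, node.visits))
        path.append(node)
    return path

def predict_accuracy(arch):
    # Stand-in for the Meta-DNN's accuracy prediction; here it is
    # just a random number so the sketch stays self-contained.
    return random.random()

def backup(path, reward):
    # Propagate the (estimated) reward up the selected path.
    for node in path:
        node.visits += 1
        node.value += reward

def search_step(root):
    path = select(root)
    reward = predict_accuracy(path[-1].arch)  # fast simulated rollout
    backup(path, reward)
```

In the full system the predicted reward would only accelerate rollouts; actual training accuracies of sampled architectures both refine the tree statistics and become new training data for the predictor.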
View on arXiv