A Transformer-based Neural Architecture Search Method

This paper presents a neural architecture search method based on the Transformer architecture that searches across multi-head attention computation schemes for different combinations of encoder and decoder layers. To find neural network structures with better translation quality, we consider perplexity as an auxiliary evaluation metric for the algorithm in addition to BLEU scores, and we iteratively improve each individual neural network in the population with a multi-objective genetic algorithm. Experimental results show that the neural network structures found by the algorithm outperform all baseline models, and that introducing the auxiliary evaluation metric yields better models than using the BLEU score as the only evaluation metric.
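
The multi-objective search loop can be pictured as follows. This is a minimal sketch under stated assumptions, not the paper's implementation: the architecture encoding (num_encoders, num_decoders, the attention variant names), the evaluate stub standing in for actual training and scoring, and the mutation operator are all illustrative, and simple Pareto-front survival is used as a stand-in for the paper's multi-objective genetic algorithm.

import random

# Hypothetical encoding of one candidate: encoder/decoder counts plus an
# attention-computation variant (names are illustrative, not the paper's
# actual search space).
ATTENTION_VARIANTS = ["scaled_dot", "additive", "relative_position"]

def random_individual():
    # Sample one architecture from the assumed search space.
    return {
        "num_encoders": random.randint(2, 8),
        "num_decoders": random.randint(2, 8),
        "attention": random.choice(ATTENTION_VARIANTS),
    }

def evaluate(ind):
    # Placeholder: in practice this would train the Transformer described
    # by `ind` and score it on a dev set, returning (BLEU, perplexity).
    bleu = random.uniform(20, 35)   # stand-in for a real BLEU score
    ppl = random.uniform(4, 12)     # stand-in for a real perplexity
    return bleu, ppl

def dominates(a, b):
    # Pareto dominance: higher BLEU is better, lower perplexity is better.
    return (a[0] >= b[0] and a[1] <= b[1]) and (a[0] > b[0] or a[1] < b[1])

def mutate(ind):
    # Perturb one field of the encoding to produce a child architecture.
    child = dict(ind)
    key = random.choice(list(child))
    if key == "attention":
        child[key] = random.choice(ATTENTION_VARIANTS)
    else:
        child[key] = max(1, child[key] + random.choice([-1, 1]))
    return child

def search(pop_size=10, generations=5):
    population = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        scored = [(ind, evaluate(ind)) for ind in population]
        # Keep the non-dominated individuals (the Pareto front).
        front = [ind for ind, s in scored
                 if not any(dominates(t, s) for _, t in scored)]
        # Refill the population by mutating survivors.
        population = front + [mutate(random.choice(front))
                              for _ in range(pop_size - len(front))]
    return population

Treating BLEU and perplexity as jointly optimized objectives, rather than collapsing them into a single scalar fitness, keeps architectures in the population that trade one metric against the other, which is the usual rationale for a multi-objective genetic algorithm in this setting.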
@article{wang2025_2505.01314,
  title={A Transformer-based Neural Architecture Search Method},
  author={Shang Wang and Huanrong Tang and Jianquan Ouyang},
  journal={arXiv preprint arXiv:2505.01314},
  year={2025}
}