A Fast Neural Architecture Search Method for Multi-Modal Classification via Knowledge Sharing

Authors: Zhihua Cui, Shiwu Sun, Qian Guo, Xinyan Liang, Yuhua Qian, Zhixia Zhang

Abstract:

Neural architecture search-based multi-modal classification (NAS-MMC) aims to automatically find optimal network structures that improve multi-modal classification performance. However, most current NAS-MMC methods are quite time-consuming during the training process. In this paper, we propose a knowledge sharing-based neural architecture search (KS-NAS) method for multi-modal classification. KS-NAS speeds up the search process by introducing a dynamically updated knowledge base that reduces computational resource consumption. Specifically, during the evolutionary search, individuals in the initial population acquire initial parameters from the knowledge base and are then trained until convergence, avoiding the need to train from scratch. The knowledge base is dynamically updated by aggregating the parameters of high-quality individuals trained within the population, progressively improving its quality. As the population evolves, the knowledge base continues to improve, ensuring that subsequent individuals obtain higher-quality initialization parameters, which significantly accelerates population training. Experimental results show that KS-NAS achieves state-of-the-art results in both classification performance and training efficiency across multiple popular multi-modal tasks.
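The two knowledge-sharing operations described above (initializing individuals from the knowledge base, then folding the parameters of high-quality individuals back into it) can be sketched as follows. This is a minimal illustration, not the authors' implementation; the noise scale, top-k elite selection, and moving-average update rule are all assumed details chosen for the sketch.

```python
import numpy as np

def init_from_knowledge_base(kb, noise=0.01, rng=None):
    """Initialize an individual's parameters from the knowledge base.

    A small perturbation is added so individuals differ (an assumed
    detail; the paper only states that individuals acquire initial
    parameters from the knowledge base).
    """
    rng = rng or np.random.default_rng()
    return {name: w + noise * rng.standard_normal(w.shape)
            for name, w in kb.items()}

def update_knowledge_base(kb, population, fitnesses, top_k=2, momentum=0.9):
    """Aggregate parameters of the top-k individuals into the knowledge base.

    Uses a moving average so the base improves gradually as the
    population evolves (the specific aggregation rule is an assumption).
    """
    elite_idx = np.argsort(fitnesses)[::-1][:top_k]  # best individuals first
    for name in kb:
        elite_mean = np.mean([population[i][name] for i in elite_idx], axis=0)
        kb[name] = momentum * kb[name] + (1 - momentum) * elite_mean
    return kb
```

In this reading, each generation draws its initial parameters from `init_from_knowledge_base`, trains, and then the best-performing individuals refine the shared base via `update_knowledge_base`, so later generations start from progressively better initializations.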

Keywords:
