ESSR: Evolving Sparse Sharing Representation for Multi-task Learning

Authors: Yayu Zhang, Yuhua Qian, Guoshuai Ma, Xinyan Liang, Guoqing Liu, Qingfu Zhang, Ke Tang

Abstract:

Multi-task learning uses knowledge transfer among tasks to improve the generalization performance of all tasks. In deep multi-task learning, knowledge transfer is often implemented by sharing all hidden features across tasks. A major shortcoming is that this can lead to negative knowledge transfer when task correlation is weak. To overcome this, this paper proposes an evolutionary method that adaptively learns sparse sharing representations. By embedding neural network optimization into evolutionary multitasking, the proposed method finds an optimal combination of tasks and shared features. It can identify negatively correlated and redundant features and remove them from the hidden feature set, so that an optimal sparse sharing subnetwork is produced for each task. Experimental results show that the proposed method achieves better learning performance with a smaller inference model than other related methods.
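To make the core idea concrete, the following is a minimal, self-contained sketch of sparse sharing via evolutionary search; it is not the paper's ESSR algorithm. The toy data, the ridge-regression surrogate used to score a mask, the sparsity penalty, and the simple bit-flip evolution are all illustrative assumptions: each task evolves a binary mask over a shared feature set so that it keeps only the shared features that help it.

```python
# Illustrative sketch (not the authors' ESSR implementation): per-task binary
# masks over a shared representation are evolved so each task keeps only the
# shared features that help it and drops redundant or harmful ones.
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: a fixed shared representation H and two tasks whose targets
# depend on disjoint feature subsets, so sharing every feature hurts.
n, d = 200, 16
H = rng.normal(size=(n, d))
y = {
    "task_a": H[:, :4].sum(axis=1) + 0.1 * rng.normal(size=n),
    "task_b": H[:, 8:12].sum(axis=1) + 0.1 * rng.normal(size=n),
}

def task_loss(mask, target):
    """Fit a ridge regressor on the masked features; return its MSE."""
    cols = np.flatnonzero(mask)
    if cols.size == 0:
        return np.inf
    X = H[:, cols]
    w = np.linalg.solve(X.T @ X + 1e-3 * np.eye(cols.size), X.T @ target)
    return float(np.mean((X @ w - target) ** 2))

def evolve_mask(target, pop_size=30, generations=40, mut_rate=0.1):
    """Simple evolutionary search over binary feature masks for one task."""
    pop = rng.integers(0, 2, size=(pop_size, d))
    for _ in range(generations):
        # Fitness = task error plus a small penalty encouraging sparsity.
        fitness = np.array([task_loss(m, target) + 0.01 * m.sum() for m in pop])
        parents = pop[np.argsort(fitness)[: pop_size // 2]]   # keep the best half
        flips = rng.random((pop_size // 2, d)) < mut_rate      # bit-flip mutation
        children = np.where(flips, 1 - parents, parents)
        pop = np.vstack([parents, children])
    fitness = np.array([task_loss(m, target) for m in pop])
    return pop[np.argmin(fitness)]

for name, target in y.items():
    mask = evolve_mask(target)
    print(name, "keeps features", np.flatnonzero(mask),
          "loss", round(task_loss(mask, target), 4))
```

In this sketch each task ends up with its own sparse subnetwork (here, a feature subset) drawn from the shared representation; the full method instead evolves such masks jointly with neural network training under an evolutionary multitasking framework.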

Keywords: Multi-task learning, evolutionary multitasking optimization, knowledge transfer, sharing representation.
