DOI: 10.3390/electronics14010182 ISSN: 2079-9292

A Feature Map Fusion Self-Distillation Scheme for Image Classification Networks

Zhenkai Qin, Shuiping Ni, Mingfu Zhu, Yue Jia, Shangxin Liu, Yawei Chen

Self-distillation has been widely applied in the field of deep learning. However, the lack of interaction among the multiple shallow branches in a self-distillation framework limits the effectiveness of self-distillation methods. To address this issue, a feature map fusion self-distillation scheme is proposed. According to the depth of the teacher model, multiple shallow branches are constructed as student models to build the self-distillation framework. A feature map fusion module then fuses the intermediate feature maps of the branches to strengthen the interaction among them. Specifically, the fusion module employs a spatial enhancement module to generate an attention mask for each branch's feature map; the masks are averaged and applied to the feature maps to produce intermediate maps, and the mean of these intermediate maps yields the final fused feature map. Experimental results on the CIFAR10 and CIFAR100 datasets show that the proposed technique consistently improves the classification accuracy of deep learning models, with average accuracy gains of 0.7% on CIFAR10 and 2.5% on CIFAR100.
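The abstract does not specify the internals of the spatial enhancement module or how branch feature maps of different resolutions are aligned. As a rough illustration only, the PyTorch sketch below assumes the branch feature maps have already been projected to a common shape and models the spatial enhancement module as a 1x1 convolution followed by a sigmoid; the class names SpatialEnhancement and FeatureMapFusion are hypothetical, not the authors' implementation.

import torch
import torch.nn as nn

class SpatialEnhancement(nn.Module):
    # Assumed form: a 1x1 convolution that collapses the channel
    # dimension into a single-channel spatial attention mask in (0, 1).
    def __init__(self, channels):
        super().__init__()
        self.conv = nn.Conv2d(channels, 1, kernel_size=1)

    def forward(self, x):
        return torch.sigmoid(self.conv(x))

class FeatureMapFusion(nn.Module):
    # Sketch of the fusion step described above: generate one attention
    # mask per branch, average the masks, apply the averaged mask to each
    # branch's feature map, and average the resulting intermediate maps.
    def __init__(self, channels):
        super().__init__()
        self.enhance = SpatialEnhancement(channels)

    def forward(self, feature_maps):
        # feature_maps: list of tensors, each of shape (B, C, H, W),
        # assumed to share the same shape across branches.
        masks = [self.enhance(f) for f in feature_maps]
        avg_mask = torch.stack(masks).mean(dim=0)            # (B, 1, H, W)
        intermediates = [f * avg_mask for f in feature_maps]
        return torch.stack(intermediates).mean(dim=0)        # fused map

For example, fusing three branch feature maps of shape (8, 64, 16, 16) with FeatureMapFusion(channels=64) returns a single fused map of the same shape, which could then serve as a shared distillation target for the branches.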
