Paper
Feature distillation with mixup for unsupervised continual learning
10 October 2023
Yiyan Gao
Proceedings Volume 12799, Third International Conference on Advanced Algorithms and Signal Image Processing (AASIP 2023); 1279925 (2023) https://doi.org/10.1117/12.3005825
Event: 3rd International Conference on Advanced Algorithms and Signal Image Processing (AASIP 2023), 2023, Kuala Lumpur, Malaysia
Abstract
We learn feature representations on unlabeled task sequences. To narrow the gap between unsupervised representation learning and continual learning, we interpolate examples from the current and previous tasks to build new examples. On this basis, we propose a novel knowledge distillation method that accounts for both the feature position at which distillation is applied and the distance function used for distillation. We use the position before ReLU as the distillation point and design a new margin ReLU function. This concentrates useful information in the middle of the network and yields further performance gains, minimizing forgetting of past tasks while improving learning of new ones.
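The two ingredients the abstract names, cross-task mixup and pre-ReLU feature distillation with a margin ReLU, can be sketched in a few lines of PyTorch. The sketch below is illustrative only: the function names, the Beta(alpha, alpha) mixing coefficient, and the partial-L2 masking (in the style of Heo et al.'s feature-distillation overhaul) are assumptions, not the paper's reference implementation.

```python
# Minimal sketch, assuming PyTorch. mixup_across_tasks, margin_relu,
# distill_loss, and alpha=1.0 are hypothetical names/choices for
# illustration; the paper's actual formulation may differ.
import torch

def mixup_across_tasks(x_current, x_previous, alpha=1.0):
    """Interpolate current-task and previous-task examples (mixup)
    to bridge unsupervised representation learning and continual
    learning, as described in the abstract."""
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    return lam * x_current + (1.0 - lam) * x_previous

def margin_relu(x, margin):
    """Margin ReLU: positives pass through unchanged; negatives are
    clamped to a (negative) per-channel margin rather than zero, so
    some pre-ReLU information survives at the distillation point.
    `margin` is a tensor of shape (1, C, 1, 1) with negative entries."""
    return torch.max(x, margin)

def distill_loss(student_feat, teacher_feat, margin):
    """Match the student's pre-ReLU features to the teacher's
    margin-ReLU features. A partial L2 distance (an assumption here)
    ignores positions where the teacher's target is negative and the
    student already lies below it."""
    target = margin_relu(teacher_feat, margin)
    mask = (student_feat > target) | (target > 0)
    diff = (student_feat - target) ** 2 * mask.float()
    return diff.mean()
```

In a continual-learning loop, one would mix a batch of current-task inputs with replayed previous-task inputs via `mixup_across_tasks`, then add `distill_loss` (student features vs. a frozen copy of the previous model's features, taken before ReLU) to the unsupervised training objective.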
© (2023) Published by SPIE. Downloading of the abstract is permitted for personal use only.
Yiyan Gao "Feature distillation with mixup for unsupervised continual learning", Proc. SPIE 12799, Third International Conference on Advanced Algorithms and Signal Image Processing (AASIP 2023), 1279925 (10 October 2023); https://doi.org/10.1117/12.3005825
KEYWORDS
Machine learning, Interpolation, Batch normalization, Deep learning, Feature extraction, Transform theory, Education and training