Few-Shot Class-Incremental Learning (FSCIL) poses a severe stability-plasticity challenge: a model must continually learn new classes from only a few training samples without forgetting the knowledge of previously seen classes. To alleviate this challenge, we propose a novel method called Prompt-based Concept Learning (PCL) for FSCIL, which generalizes conceptual knowledge learned from old classes to new classes by simulating human learning capabilities. In the base session, PCL simultaneously learns common basic concepts from the training data and the class-concept weight of each class in a prompt learning manner; in each incremental session, class-concept weights between the new classes and the previously learned basic concepts are learned to achieve incremental learning. Furthermore, to avoid catastrophic forgetting, we propose a distribution estimation module that retains the feature distributions of previously seen classes and a data replay module that randomly samples features of previously seen classes in incremental sessions. We verify the effectiveness of PCL on widely used benchmarks, including miniImageNet, CIFAR-100, and CUB-200. Experimental results show that PCL achieves competitive results compared with other state-of-the-art methods; in particular, it attains an average accuracy of 94.02% across all sessions on the miniImageNet benchmark.
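To make the two main components more concrete, below is a minimal, hypothetical NumPy sketch of the ideas the abstract describes: class scores computed as class-concept weighted sums of similarities to frozen basic concepts, and a distribution estimation/replay scheme for old-class features. The names, the linear-similarity scoring, and the diagonal-Gaussian assumption are our illustrative choices, not details taken from the paper; this is a sketch of the general technique, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Class-concept weighting (hypothetical sketch) ---------------------
# Assume the base session has produced K basic concept prototypes.
# A class score is a weighted sum of a feature's similarity to each
# concept; in incremental sessions only new rows of W are learned,
# while the concept prototypes stay frozen.
K, D = 16, 64                       # number of concepts, feature dim
concepts = rng.normal(size=(K, D))  # frozen basic-concept prototypes
W = rng.normal(size=(5, K))         # class-concept weights (5 classes)

def class_scores(feature):
    sims = concepts @ feature       # similarity to each concept, shape (K,)
    return W @ sims                 # class-concept weighted scores, (C,)

# --- Distribution estimation + data replay (hypothetical sketch) -------
# Summarize each seen class by a diagonal Gaussian over its features,
# then sample synthetic features in later sessions instead of storing
# raw data from earlier sessions.
stats = {}

def estimate(class_id, feats):
    # feats: (n_samples, D) backbone features of one class
    stats[class_id] = (feats.mean(axis=0), feats.std(axis=0) + 1e-6)

def replay(class_id, n):
    mean, std = stats[class_id]
    return rng.normal(mean, std, size=(n, mean.size))

estimate(0, rng.normal(size=(500, D)))  # after the base session
old_feats = replay(0, 25)               # replayed in an incremental session
print(class_scores(rng.normal(size=D)).shape, old_feats.shape)
```

In this sketch, replayed old-class features would be mixed with the few real new-class samples when fitting the new class-concept weights, which is one plausible way the replay module could counteract forgetting.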