Few-Shot Class-Incremental Learning via Feature Space Composition
by
Hanbin Zhao, Yongjian Fu, Xuewei Li, Songyuan Li, Bourahla Omar, Xi Li
2020
Abstract
As a challenging problem in machine learning, few-shot class-incremental
learning asynchronously learns a sequence of tasks, acquiring new knowledge
from new tasks (with limited new samples) while preserving the knowledge
learned from previous tasks (with old samples discarded). In general, existing
approaches resort to one unified feature space to balance old-knowledge
preservation and new-knowledge adaptation. Given the limited embedding capacity
of the feature representation, the unified feature space often makes the
learner suffer from semantic drift or overfitting as the number of tasks
increases.
With this motivation, we propose a novel few-shot class-incremental learning
pipeline based on a composite representation space, which makes old-knowledge
preserving and new-knowledge adaptation mutually compatible by feature space
composition (enlarging the embedding capacity). The composite representation
space is generated by integrating two space components (i.e., a stable base
knowledge space and a dynamic lifelong-learning knowledge space) in terms of
distance metric construction. With the composite feature space, our method
performs remarkably well on the CUB200 and CIFAR100 datasets, outperforming the
state-of-the-art algorithms by 10.58% and 14.65%, respectively.
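The core idea in the abstract — combining a stable base knowledge space with a dynamic lifelong-learning knowledge space through distance metric construction — can be sketched as nearest-prototype classification under a weighted composite distance. This is an illustrative assumption, not the paper's actual implementation; all names and the mixing weight `alpha` are hypothetical:

```python
import math

def dist(x, y):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

def composite_distance(x_base, x_life, proto_base, proto_life, alpha=0.5):
    """Combine distances measured in the two feature spaces.

    alpha is an assumed mixing weight between the stable base space
    and the dynamic lifelong space; it is not taken from the paper.
    """
    return alpha * dist(x_base, proto_base) + (1 - alpha) * dist(x_life, proto_life)

def classify(x_base, x_life, protos_base, protos_life, alpha=0.5):
    """Pick the class whose prototype is nearest under the composite metric."""
    scores = [composite_distance(x_base, x_life, pb, pl, alpha)
              for pb, pl in zip(protos_base, protos_life)]
    return scores.index(min(scores))
```

In this reading, each class keeps one prototype per space, and the composite metric lets a new-task sample be matched without re-training the base space — one plausible way "enlarging the embedding capacity" makes preservation and adaptation compatible.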
Archived Files and Locations
application/pdf, 1.7 MB (arXiv version 2006.15524v1)
arxiv.org (repository); web.archive.org (webarchive)