Lifelong Learning via Progressive Distillation and Retrospection
Saihui Hou, Xinyu Pan, Chen Change Loy, Zilei Wang, Dahua Lin; The European Conference on Computer Vision (ECCV), 2018, pp. 437-452
Abstract
Lifelong learning aims at adapting a learned model to new tasks while retaining the knowledge gained earlier. A key challenge for lifelong learning is how to strike a balance between preserving performance on old tasks and adapting to a new one within a given model. Approaches that combine both objectives in training have been explored in previous works, yet performance still degrades considerably over a long sequence of tasks. In this work, we propose a novel approach to lifelong learning that seeks a better balance between preservation and adaptation via two techniques: Distillation and Retrospection. Specifically, the target model adapts to the new task by knowledge distillation from an intermediate expert, while the previous knowledge is more effectively preserved by caching a small subset of data for old tasks. The combination of Distillation and Retrospection leads to a gentler learning curve for the target model, and extensive experiments demonstrate that our approach brings consistent improvements on both old and new tasks.
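To make the combination of Distillation and Retrospection concrete, the following is a minimal PyTorch-style sketch of one training step, not the authors' implementation: the target model is distilled from an intermediate expert on new-task data, and distilled from frozen copies of its old-task heads on a small cached exemplar set. The interfaces `target_model(x, task=...)`, `expert_model`, `old_models`, and `retro_batches` are hypothetical names chosen for illustration.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # Standard soft-target distillation loss with temperature T.
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    return -(p_teacher * log_p_student).sum(dim=1).mean() * (T * T)

def lifelong_step(target_model, expert_model, old_models,
                  new_batch, retro_batches, optimizer, T=2.0):
    """One optimization step combining:
      - Distillation: the target model mimics an intermediate expert on new-task data.
      - Retrospection: old-task outputs are kept close to the frozen old models
        on a small cached subset of old-task data.
    All model/interface names here are assumptions for the sketch."""
    optimizer.zero_grad()

    # Adaptation to the new task via distillation from the expert.
    x_new, _ = new_batch
    with torch.no_grad():
        expert_logits = expert_model(x_new)
    loss = distillation_loss(target_model(x_new, task="new"), expert_logits, T)

    # Preservation of old tasks via retrospection on cached exemplars.
    for task_id, (x_old, _) in retro_batches.items():
        with torch.no_grad():
            old_logits = old_models[task_id](x_old)
        loss = loss + distillation_loss(target_model(x_old, task=task_id), old_logits, T)

    loss.backward()
    optimizer.step()
    return loss.item()
```

In the paper, the expert is itself trained progressively for the new task before distillation; this sketch only illustrates how the two loss terms are combined in a single step.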
Related Material
[pdf] [bibtex]
@InProceedings{Hou_2018_ECCV,
author = {Hou, Saihui and Pan, Xinyu and Change Loy, Chen and Wang, Zilei and Lin, Dahua},
title = {Lifelong Learning via Progressive Distillation and Retrospection},
booktitle = {The European Conference on Computer Vision (ECCV)},
month = {September},
year = {2018}
}