Links 📚

Course Materials

All the course material in one page!
01_intro_and_motivation.pdf (Slides PDF, 4 MB)
Classic readings on catastrophic forgetting
Catastrophic Forgetting; Catastrophic Interference; Stability; Plasticity; Rehearsal, by Anthony Robins. Connection Science, 123–146, 1995.
Using Semi-Distributed Representations to Overcome Catastrophic Forgetting in Connectionist Networks, by Robert French. In Proceedings of the 13th Annual Cognitive Science Society Conference, 173–178, 1991. [sparsity]
Check out additional material for popular reviews and surveys on continual learning.
Lifelong Machine Learning, Second Edition, by Zhiyuan Chen and Bing Liu. Synthesis Lectures on Artificial Intelligence and Machine Learning, 2018.
02_forgetting.pdf (Slides PDF, 2 MB)
The classic references on catastrophic forgetting are provided above.
Does Continual Learning = Catastrophic Forgetting?, by A. Thai, S. Stojanov, I. Rehg, and J. M. Rehg, arXiv, 2021.
An Empirical Study of Example Forgetting during Deep Neural Network Learning, by M. Toneva, A. Sordoni, R. T. des Combes, A. Trischler, Y. Bengio, and G. J. Gordon, ICLR, 2019.
Compete to Compute, by R. K. Srivastava, J. Masci, S. Kazerounian, F. Gomez, and J. Schmidhuber, NIPS, 2013. [Permuted MNIST task; see the construction sketch below]
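The Permuted MNIST protocol used above builds each task by applying one fixed random pixel permutation to every MNIST image, so the inputs change radically between tasks while the labels keep their meaning. A minimal sketch of the construction (plain NumPy; the function name and arguments are illustrative, not taken from any paper above):

import numpy as np

def make_permuted_tasks(x, n_tasks, seed=0):
    # x: (n_samples, n_pixels) array, e.g. flattened 28x28 MNIST digits.
    # Returns one input array per task; the labels are reused unchanged.
    rng = np.random.default_rng(seed)
    tasks = []
    for _ in range(n_tasks):
        perm = rng.permutation(x.shape[1])  # one fixed permutation per task
        tasks.append(x[:, perm])            # same permutation for every image
    return tasks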
03_benchmarks.pdf (Slides PDF, 2 MB)
CL scenarios
Three scenarios for continual learning, by G. M. van de Ven and A. S. Tolias, Continual Learning workshop at NeurIPS, 2018. [task-, domain-, and class-incremental learning]
Continuous Learning in Single-Incremental-Task Scenarios, by D. Maltoni and V. Lomonaco, Neural Networks, vol. 116, pp. 56–73, 2019. [content update types: New Classes (NC), New Instances (NI), New Instances and Classes (NIC); settings: Single-Incremental-Task (SIT), Multi-Task (MT), Multiple-Incremental-Task (MIT)]
Task-Free Continual Learning, by R. Aljundi, K. Kelchtermans, and T. Tuytelaars, CVPR, 2019.
Continual Prototype Evolution: Learning Online from Non-Stationary Data Streams, by M. De Lange and T. Tuytelaars, ICCV, 2021. [data-incremental scenario and comparisons with other CL scenarios]
Survey presenting CL scenarios
Continual Learning for Robotics: Definition, Framework, Learning Strategies, Opportunities and Challenges, by Timothée Lesort, Vincenzo Lomonaco, Andrei Stoian, Davide Maltoni, David Filliat, and Natalia Díaz-Rodríguez. Information Fusion, 52–68, 2020. [see Section 3 in particular]
CL benchmarks
CORe50: a New Dataset and Benchmark for Continuous Object Recognition, by V. Lomonaco and D. Maltoni, Proceedings of the 1st Annual Conference on Robot Learning, vol. 78, pp. 17–26, 2017.
Incremental Object Learning From Contiguous Views, by S. Stojanov et al., CVPR, 2019. [CRIB benchmark]
Stream-51: Streaming Classification and Novelty Detection From Videos, by R. Roady, T. L. Hayes, H. Vaidya, and C. Kanan, CVPR Workshops, 2020.
04_evaluation.pdf (Slides PDF, 2 MB)
Efficient Lifelong Learning with A-GEM, by A. Chaudhry, M. Ranzato, M. Rohrbach, and M. Elhoseiny, ICLR, 2019. [evaluation protocol with "split by experiences"]
Gradient Episodic Memory for Continual Learning, by D. Lopez-Paz and M. Ranzato, NIPS, 2017. [popular formalization of the ACC, BWT, and FWT metrics; definitions follow this list]
Don’t forget, there is more than forgetting: new metrics for Continual Learning, by N. Díaz-Rodríguez, V. Lomonaco, D. Filliat, and D. Maltoni, arXiv, 2018. [definitions of additional metrics]
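For quick reference, the GEM paper defines these metrics from a matrix R with entries R_{i,j}, the test accuracy on task t_j after training on the i-th task in sequence. In LaTeX notation, following Lopez-Paz and Ranzato (2017), for T tasks:

\mathrm{ACC} = \frac{1}{T} \sum_{i=1}^{T} R_{T,i}

\mathrm{BWT} = \frac{1}{T-1} \sum_{i=1}^{T-1} \left( R_{T,i} - R_{i,i} \right)

\mathrm{FWT} = \frac{1}{T-1} \sum_{i=2}^{T} \left( R_{i-1,i} - \bar{b}_i \right)

where \bar{b}_i is the test accuracy on task t_i of a model at random initialization. Negative BWT indicates forgetting, while FWT measures zero-shot transfer to tasks not yet trained on.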
05_methodologies_part1.pdf (Slides PDF, 2 MB)
Replay
GDumb: A Simple Approach that Questions Our Progress in Continual Learning, by A. Prabhu, P. H. S. Torr, and P. K. Dokania, ECCV, 2020.
Online Continual Learning with Maximal Interfered Retrieval, by R. Aljundi et al., NeurIPS, 2019.
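To make the replay idea concrete before the variants below, here is a minimal sketch of an experience-replay memory based on reservoir sampling (plain Python; class and method names are illustrative, not taken from any paper above):

import random

class ReservoirBuffer:
    # Fixed-capacity replay memory filled via reservoir sampling, so every
    # example seen so far has an equal probability of being stored.
    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.data = []
        self.n_seen = 0
        self.rng = random.Random(seed)

    def add(self, example):
        self.n_seen += 1
        if len(self.data) < self.capacity:
            self.data.append(example)
        else:
            j = self.rng.randrange(self.n_seen)
            if j < self.capacity:
                self.data[j] = example  # evict a stored example at random

    def sample(self, k):
        # Mini-batch of past examples to mix with the current batch.
        return self.rng.sample(self.data, min(k, len(self.data)))

During training, each incoming example is add()-ed and every new mini-batch is augmented with a sample() of stored examples.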
Latent replay
Latent Replay for Real-Time Continual Learning, by Lorenzo Pellegrini, Gabriele Graffieti, Vincenzo Lomonaco, and Davide Maltoni, IROS, 2020.
Generative replay
Continual Learning with Deep Generative Replay, by H. Shin, J. K. Lee, J. Kim, and J. Kim, NIPS, 2017.
Brain-inspired replay for continual learning with artificial neural networks, by G. M. van de Ven, H. T. Siegelmann, and A. S. Tolias, Nature Communications, 2020.
06_methodologies_part2.pdf (Slides PDF, 2 MB)
Baselines: L1 and L2 regularization, Dropout.
Regularization strategies
Learning without Forgetting, by Z. Li and D. Hoiem, TPAMI, 2017. [LwF]
Overcoming catastrophic forgetting in neural networks, by Kirkpatrick et al., PNAS, 2017. [EWC; a penalty sketch follows this list]
Continual learning with hypernetworks, by von Oswald et al., ICLR, 2020.
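As a rough illustration of the regularization family, EWC (Kirkpatrick et al., above) adds a quadratic penalty that anchors the parameters that were important for earlier tasks, with importance given by a diagonal Fisher information estimate. A minimal PyTorch sketch of the penalty term, assuming the Fisher estimates and old parameters were saved after the previous task (helper names are illustrative):

import torch

def ewc_penalty(model, old_params, fisher, lam=1.0):
    # Penalty: (lam / 2) * sum_i F_i * (theta_i - theta_i_old)^2
    # old_params, fisher: dicts {param_name: tensor} saved after the previous task.
    penalty = torch.zeros(())
    for name, p in model.named_parameters():
        if name in fisher:
            penalty = penalty + (fisher[name] * (p - old_params[name]) ** 2).sum()
    return 0.5 * lam * penalty

The loss on the new task then becomes task_loss + ewc_penalty(model, old_params, fisher, lam).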
Architectural strategies (a minimal multi-head sketch follows this list)
Rehearsal-Free Continual Learning over Small Non-I.I.D. Batches, by Lomonaco et al., CLVision Workshop at CVPR, 2020. [CWR*]
Progressive Neural Networks, by Rusu et al., arXiv, 2016.
Supermasks in Superposition, by Wortsman et al., NeurIPS, 2020.
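For intuition about the architectural family, the simplest baseline is a multi-head network for task-incremental learning: a shared backbone with one output layer per task, so new heads never overwrite old ones. A sketch in PyTorch (a generic baseline, not the method of any specific paper above):

import torch
from torch import nn

class MultiHeadNet(nn.Module):
    # Shared feature extractor plus one classification head per task.
    def __init__(self, in_dim, hidden, classes_per_task, n_tasks):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.heads = nn.ModuleList(
            [nn.Linear(hidden, classes_per_task) for _ in range(n_tasks)]
        )

    def forward(self, x, task_id):
        # The task label selects the head, so old heads stay untouched
        # while the shared backbone is still subject to forgetting.
        return self.heads[task_id](self.backbone(x))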
07_methodologies_part3.pdf (Slides PDF, 3 MB)
Hybrid strategies
Gradient Episodic Memory for Continual Learning, by Lopez-Paz et al., NIPS, 2017. [GEM; see the A-GEM projection sketch after this list]
Latent Replay for Real-Time Continual Learning, by L. Pellegrini et al., IROS, 2020. [AR1*]
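As a concrete example of a hybrid (replay plus optimization constraint) strategy, the simpler A-GEM variant of GEM (Chaudhry et al., cited under lecture 04) projects the current gradient whenever it conflicts with a reference gradient computed on a batch drawn from the replay memory. A minimal sketch under that reading:

import torch

def agem_project(grad, grad_ref):
    # grad: flattened gradient on the current batch.
    # grad_ref: flattened gradient on a batch sampled from the replay memory.
    dot = torch.dot(grad, grad_ref)
    if dot < 0:  # the update would increase the loss on past data
        grad = grad - (dot / torch.dot(grad_ref, grad_ref)) * grad_ref
    return grad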
Applications
Continual Learning in Practice, by T. Diethe et al., Continual Learning Workshop at NeurIPS, 2018.
Startups / Companies: CogitAI, Neurala, Gantry
Tools / Libraries: Avalanche, Continuum, Sequoia, CL-Gym
08_frontiers.pdf (Slides PDF, 4 MB)
Embracing Change: Continual Learning in Deep Neural Networks, by Hadsell et al., Trends in Cognitive Sciences, 2020. [continual meta-learning and meta-continual learning]
Continual Unsupervised Representation Learning, by D. Rao et al., NeurIPS, 2019.
Distributed continual learning
Ex-Model: Continual Learning from a Stream of Trained Models, by Carta et al., arXiv, 2021.
Continual sequence learning
Continual learning for recurrent neural networks: An empirical evaluation, by Cossu et al., Neural Networks, vol. 143, pp. 607–627, 2021.
Continual Learning with Echo State Networks, by Cossu et al., ESANN, 2021.

Software

Avalanche: an End-to-End Library for Continual Learning, the software library based on PyTorch used for the coding session of this course (see the minimal usage sketch below).
ContinualAI Colab notebooks: coding continual learning from scratch in notebooks.
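To give a flavor of the coding session, a minimal Avalanche run might look like the sketch below, based on Avalanche's SplitMNIST benchmark, SimpleMLP model, and Naive strategy (exact import paths and signatures vary across Avalanche versions, so treat this as an approximation rather than the course's reference code):

import torch
from avalanche.benchmarks.classic import SplitMNIST
from avalanche.models import SimpleMLP
from avalanche.training.supervised import Naive

# 5 class-incremental experiences, 2 MNIST classes each.
benchmark = SplitMNIST(n_experiences=5)
model = SimpleMLP(num_classes=benchmark.n_classes)

# Naive = plain fine-tuning on each experience, the standard lower bound.
strategy = Naive(
    model,
    torch.optim.SGD(model.parameters(), lr=0.001),
    torch.nn.CrossEntropyLoss(),
    train_mb_size=32, train_epochs=1, eval_mb_size=32,
)

for experience in benchmark.train_stream:  # one experience at a time
    strategy.train(experience)
    strategy.eval(benchmark.test_stream)   # evaluate on all test splits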