📚 Course Materials
All the course material in one page!
Slides: 01_intro_and_motivation.pdf (PDF, 4MB)
Classic readings on catastrophic forgetting
Catastrophic Forgetting, Rehearsal and Pseudorehearsal, by Anthony Robins. Connection Science, 123–146, 1995.
Using Semi-Distributed Representations to Overcome Catastrophic Forgetting in Connectionist Networks, by Robert French. In Proceedings of the 13th Annual Cognitive Science Society Conference, 173–178, 1991. [sparsity]
Check out additional material for popular reviews and surveys on continual learning.
Lifelong Machine Learning, Second Edition, by Zhiyuan Chen and Bing Liu. Synthesis Lectures on Artificial Intelligence and Machine Learning, 2018.
Slides: 02_forgetting.pdf (PDF, 2MB)
Classic references on Catastrophic Forgetting are provided above.
Does Continual Learning = Catastrophic Forgetting?, by A. Thai, S. Stojanov, I. Rehg, and J. M. Rehg, arXiv, 2021.
An Empirical Study of Example Forgetting during Deep Neural Network Learning, by M. Toneva, A. Sordoni, R. T. des Combes, A. Trischler, Y. Bengio, and G. J. Gordon, ICLR, 2019.
Compete to Compute, by R. K. Srivastava, J. Masci, S. Kazerounian, F. Gomez, and J. Schmidhuber, NIPS, 2013. Permuted MNIST task (a sketch of the protocol follows this list).
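To make the Permuted MNIST protocol concrete, here is a minimal sketch (illustrative helper with assumed names, not code from the paper): every task reuses the same images and labels, but applies its own fixed random pixel permutation, so tasks share no input structure while staying equally hard.

```python
# Minimal sketch of the Permuted MNIST protocol (assumed setup).
import numpy as np

def make_permuted_tasks(x, y, n_tasks, seed=0):
    """x: (N, 784) flattened images, y: (N,) labels.
    Returns one (x_permuted, y) pair per task."""
    rng = np.random.RandomState(seed)
    tasks = []
    for t in range(n_tasks):
        # Task 0 keeps the original pixel order; each later task draws a
        # permutation that stays fixed for all of that task's images.
        perm = np.arange(x.shape[1]) if t == 0 else rng.permutation(x.shape[1])
        tasks.append((x[:, perm], y))
    return tasks
```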
Slides: 03_benchmarks.pdf (PDF, 2MB)
CL scenarios
Three scenarios for continual learning, by G. M. van de Ven and A. S. Tolias, Continual Learning workshop at NeurIPS, 2018. Task-, domain-, and class-incremental learning (a class-incremental sketch follows this list).
Continuous Learning in Single-Incremental-Task Scenarios, by D. Maltoni and V. Lomonaco, Neural Networks, vol. 116, pp. 56–73, 2019. New Classes (NC), New Instances (NI), and New Instances and Classes (NIC) content updates; Single-Incremental-Task (SIT), Multi-Task (MT), and Multi-Incremental-Task (MIT) settings.
Task-Free Continual Learning, by R. Aljundi, K. Kelchtermans, and T. Tuytelaars, CVPR, 2019.
Continual Prototype Evolution: Learning Online from Non-Stationary Data Streams, by M. De Lange and T. Tuytelaars, ICCV, 2021. Data-incremental learning and comparisons with other CL scenarios.
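As a concrete example of the class-incremental scenario referenced above, a minimal sketch with a hypothetical helper: the stream is a sequence of experiences, each introducing a disjoint set of new classes, and no task label is available at test time.

```python
# Minimal sketch of a class-incremental split (hypothetical helper).
import numpy as np

def class_incremental_splits(x, y, classes_per_exp):
    """Partition (x, y) into experiences, each introducing new classes."""
    classes = np.unique(y)
    experiences = []
    for i in range(0, len(classes), classes_per_exp):
        chunk = classes[i:i + classes_per_exp]
        mask = np.isin(y, chunk)
        experiences.append((x[mask], y[mask]))
    return experiences
```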
Survey presenting CL scenarios
Continual Learning for Robotics: Definition, Framework, Learning Strategies, Opportunities and Challenges, by Timothée Lesort, Vincenzo Lomonaco, Andrei Stoian, Davide Maltoni, David Filliat and Natalia Díaz-Rodríguez. Information Fusion, 52–68, 2020. Section 3, in particular.
CL benchmarks
CORe50: a New Dataset and Benchmark for Continuous Object Recognition, by V. Lomonaco and D. Maltoni, Proceedings of the 1st Annual Conference on Robot Learning, vol. 78, pp. 17–26, 2017.
Incremental Object Learning From Contiguous Views, by S. Stojanov et al., CVPR, 2019. CRIB benchmark.
Stream-51: Streaming Classification and Novelty Detection From Videos, by R. Roady, T. L. Hayes, H. Vaidya, and C. Kanan, CVPR Workshops, 2020.

Evaluation & Metrics

Slides: 04_evaluation.pdf (PDF, 2MB)
Efficient Lifelong Learning with A-GEM, by A. Chaudhry, M. Ranzato, M. Rohrbach, and M. Elhoseiny, ICLR, 2019. Evaluation protocol with "split by experiences".
Gradient Episodic Memory for Continual Learning, by D. Lopez-Paz and M. Ranzato, NIPS, 2017. Popular formalization of ACC, BWT, and FWT (definitions sketched below).
Don't forget, there is more than forgetting: new metrics for Continual Learning, by N. Díaz-Rodríguez, V. Lomonaco, D. Filliat, and D. Maltoni, arXiv, 2018. Definition of additional metrics.
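For reference, the GEM formalization of these metrics: let $R \in \mathbb{R}^{T \times T}$ be the matrix whose entry $R_{i,j}$ is the test accuracy on task $t_j$ after training on task $t_i$, and let $\bar{b}_j$ be the test accuracy on task $t_j$ of a model at random initialization. Then:

```latex
\mathrm{ACC} = \frac{1}{T} \sum_{j=1}^{T} R_{T,j}, \qquad
\mathrm{BWT} = \frac{1}{T-1} \sum_{j=1}^{T-1} \left( R_{T,j} - R_{j,j} \right), \qquad
\mathrm{FWT} = \frac{1}{T-1} \sum_{j=2}^{T} \left( R_{j-1,j} - \bar{b}_j \right)
```

Negative BWT quantifies forgetting on earlier tasks; positive FWT indicates that learning earlier tasks helps tasks not yet seen.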
Slides: 05_methodologies_part1.pdf (PDF, 2MB)
Replay
GDumb: A Simple Approach that Questions Our Progress in Continual Learning, by A. Prabhu, P. H. S. Torr, and P. K. Dokania, ECCV, 2020.
Online Continual Learning with Maximal Interfered Retrieval, by R. Aljundi et al., NeurIPS, 2019.
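Replay strategies differ mainly in how the bounded memory is filled and queried; a common default for online streams is reservoir sampling. A minimal sketch, assuming a simple stream of (x, y) pairs (not code from either paper above):

```python
# Minimal sketch of a reservoir-sampling replay buffer (assumed design).
import random

class ReservoirBuffer:
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = []   # stored (x, y) pairs
        self.seen = 0    # examples observed so far

    def add(self, example):
        self.seen += 1
        if len(self.data) < self.capacity:
            self.data.append(example)
        else:
            # Replace a stored item with probability capacity/seen, so the
            # buffer remains a uniform sample of the whole stream.
            idx = random.randrange(self.seen)
            if idx < self.capacity:
                self.data[idx] = example

    def sample(self, k):
        return random.sample(self.data, min(k, len(self.data)))
```

GDumb pushes the memory-only idea to its extreme: it greedily keeps a class-balanced buffer and retrains from scratch on it at test time, which is why it serves as a sanity-check baseline.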
Latent replay
Latent Replay for Real-Time Continual Learning, by L. Pellegrini, G. Graffieti, V. Lomonaco, and D. Maltoni, IROS, 2020.
Generative replay
Continual Learning with Deep Generative Replay, by H. Shin, J. K. Lee, J. Kim, and J. Kim, NIPS, 2017.
Brain-inspired replay for continual learning with artificial neural networks, by G. M. van de Ven, H. T. Siegelmann, and A. S. Tolias, Nature Communications, 2020.
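When storing raw data is not an option, the generative replay loop of Shin et al. replaces the buffer with a generative model. A minimal sketch under assumed components: a `generator` trained on past tasks, a frozen `prev_solver` that pseudo-labels the generated samples, and a `latent_dim`; all names are illustrative.

```python
# Minimal sketch of a generative replay training step (assumed components).
import torch

def generative_replay_batch(generator, prev_solver, batch_size, latent_dim):
    """Sample pseudo-inputs from the generator and label them with the
    solver trained on previous tasks."""
    with torch.no_grad():
        z = torch.randn(batch_size, latent_dim)
        x_replay = generator(z)
        y_replay = prev_solver(x_replay).argmax(dim=1)
    return x_replay, y_replay

def train_step(solver, optimizer, criterion, x_new, y_new,
               generator, prev_solver, latent_dim):
    # Mix the real current-task batch with an equally sized replayed batch.
    x_rep, y_rep = generative_replay_batch(
        generator, prev_solver, x_new.size(0), latent_dim)
    x = torch.cat([x_new, x_rep])
    y = torch.cat([y_new, y_rep])
    optimizer.zero_grad()
    loss = criterion(solver(x), y)
    loss.backward()
    optimizer.step()
    return loss.item()
```

The generator itself is also retrained on a mix of new data and its own samples, so both parts of the "scholar" move forward together.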
Slides: 06_methodologies_part2.pdf (PDF, 2MB)
L1, L2, Dropout
Understanding the Role of Training Regimes in Continual Learning, by Mirzadeh et al., NeurIPS, 2020.
Regularization strategies
Learning without Forgetting, by Li et al., TPAMI, 2017.
Overcoming catastrophic forgetting in neural networks, by Kirkpatrick et al., PNAS, 2017. EWC (a sketch of the penalty follows this list).
Continual learning with hypernetworks, by von Oswald et al., ICLR, 2020.
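A minimal sketch of the EWC penalty from Kirkpatrick et al., assuming a diagonal Fisher information estimated on the previous task's data; the dict-based `fisher` and `old_params` structures are illustrative, not the paper's code.

```python
# Minimal sketch of Elastic Weight Consolidation (assumed data structures).
import torch

def diagonal_fisher(model, data_loader, criterion):
    """Estimate the diagonal Fisher as the mean squared gradient."""
    fisher = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
    for x, y in data_loader:
        model.zero_grad()
        criterion(model(x), y).backward()
        for n, p in model.named_parameters():
            if p.grad is not None:
                fisher[n] += p.grad.detach() ** 2
    return {n: f / len(data_loader) for n, f in fisher.items()}

def ewc_penalty(model, fisher, old_params, lam):
    """lam/2 * sum_i F_i (theta_i - theta*_i)^2, added to the new-task loss."""
    loss = torch.zeros(())
    for n, p in model.named_parameters():
        loss = loss + (fisher[n] * (p - old_params[n]) ** 2).sum()
    return 0.5 * lam * loss
```

The penalty anchors parameters that mattered for the old task (large Fisher values) while leaving unimportant ones free to move.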
Architectural strategies
Rehearsal-Free Continual Learning over Small Non-I.I.D. Batches, by Lomonaco et al., CLVision Workshop at CVPR, 2020. CWR*.
Progressive Neural Networks, by Rusu et al., arXiv, 2016.
Supermasks in Superposition, by Wortsman et al., NeurIPS, 2020.
Slides: 07_methodologies_part3.pdf (PDF, 3MB)
Hybrid strategies
Gradient Episodic Memory for Continual Learning, by Lopez-Paz and Ranzato, NIPS, 2017. GEM (a gradient-projection sketch follows this list).
iCaRL: Incremental Classifier and Representation Learning, by Rebuffi et al., CVPR, 2017.
Latent Replay for Real-Time Continual Learning, by L. Pellegrini et al., IROS, 2020. AR1*.
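To show how memory-based hybrids act at the gradient level, here is the A-GEM correction (the cheaper relative of GEM, cited in the Evaluation & Metrics section above), sketched on flattened gradient vectors; extracting those vectors from a model is left out.

```python
# Minimal sketch of the A-GEM gradient projection (assumed flat gradients).
import torch

def agem_project(g, g_ref):
    """g: gradient on the current batch; g_ref: gradient on memory samples.
    If the two conflict, project g so the memory loss does not increase."""
    dot = torch.dot(g, g_ref)
    if dot < 0:
        g = g - (dot / torch.dot(g_ref, g_ref)) * g_ref
    return g
```

GEM proper solves a quadratic program with one constraint per past task; A-GEM keeps a single averaged constraint, which is what makes this one-line projection sufficient.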
Applications
Continual Learning in Practice, by T. Diethe et al., Continual Learning Workshop at NeurIPS, 2018.
Startups / Companies: CogitAI, Neurala, Gantry
Tools / Libraries: Avalanche, Continuum, Sequoia, CL-Gym
Slides: 08_frontiers.pdf (PDF, 4MB)
Embracing Change: Continual Learning in Deep Neural Networks, by Hadsell et al., Trends in Cognitive Sciences, 2020. Continual meta-learning and meta-continual learning.
Continual Unsupervised Representation Learning, by D. Rao et al., NeurIPS, 2019.
Distributed Continual Learning
Ex-Model: Continual Learning from a Stream of Trained Models, by Carta et al., arXiv, 2021.
Continual Sequence Learning
Continual learning for recurrent neural networks: An empirical evaluation, by Cossu et al., Neural Networks, vol. 143, pp. 607–627, 2021.
Continual Learning with Echo State Networks, by Cossu et al., ESANN, 2021.

Invited Lectures

Software

Avalanche: an End-to-End Library for Continual Learning, the PyTorch-based library used for the coding sessions of this course.
ContinualAI Colab notebooks: coding continual learning from scratch in notebooks.