Continual Learning Course
  • Continual Learning: On Machines that can Learn Continually
  • Background
    • 🔡Prerequisites
    • 🛠️Tools & Setup
    • 📑Course Details
  • Lectures
    • 📍Introduction & Motivation
    • 📍Understanding Catastrophic Forgetting
    • 📍Scenarios & Benchmarks
    • 📍Evaluation & Metrics
    • 📍Methodologies [Part 1]
    • 📍Methodologies [Part 2]
    • 📍Methodologies [Part 3], Applications & Tools
    • 📍Frontiers in Continual Learning
  • Invited & Extra Lectures
    • 💻Avalanche Dev Day
    • 🔮Invited Talks
  • Resources
    • 📚Course Materials
    • 🔀Additional Material
  • About Us
    • 👨‍🏫Your Instructor
    • 🆘Teaching Assistants
Understanding Catastrophic Forgetting

The Biggest Obstacle for Continual Learning Machines


Last updated 3 years ago

In this lecture we will address the following points:

  • What is catastrophic forgetting?

  • Understanding forgetting with one neuron

  • A deep learning example: Permuted and Split MNIST

  • Brainstorming session: how to solve forgetting?

  • Avalanche: an end-to-end library for continual learning research
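The "forgetting with one neuron" idea from the list above can be sketched in a few lines of plain Python. This is an illustrative toy, not the lecture's exact example: a single linear neuron y = w·x is trained with gradient descent on task A, then on task B, and its error on task A is measured before and after. The target weights, learning rate, and data sizes are arbitrary choices for the demo.

```python
import random

# Single neuron y = w * x with squared-error loss, trained by plain
# gradient descent. Training sequentially on two "tasks" (two target
# weights) shows performance on task A collapsing after task B.

def make_task(target_w, n=50, seed=0):
    """Noiseless regression task: learn y = target_w * x."""
    rng = random.Random(seed)
    xs = [rng.uniform(-1.0, 1.0) for _ in range(n)]
    return [(x, target_w * x) for x in xs]

def mse(w, data):
    return sum((w * x - y) ** 2 for x, y in data) / len(data)

def train(w, data, lr=0.1, epochs=100):
    for _ in range(epochs):
        for x, y in data:
            grad = 2 * (w * x - y) * x  # d/dw of (w*x - y)^2
            w -= lr * grad
    return w

task_a = make_task(target_w=2.0, seed=1)    # task A wants w near 2
task_b = make_task(target_w=-3.0, seed=2)   # task B wants w near -3

w = 0.0
w = train(w, task_a)
loss_a_before = mse(w, task_a)  # near zero: task A is learned

w = train(w, task_b)            # continue training on task B only
loss_a_after = mse(w, task_a)   # large: task A has been "forgotten"

print(f"task A loss after training on A: {loss_a_before:.4f}")
print(f"task A loss after training on B: {loss_a_after:.4f}")
```

Because the neuron has a single parameter, the two tasks directly compete for it: fitting task B necessarily overwrites the solution for task A. The same dynamic, spread over millions of shared weights, is what the Permuted and Split MNIST experiments in this lecture demonstrate at scale.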

📍Lecture #2: Understanding Catastrophic Forgetting

  • Video Recording
  • Slides: 02_forgetting.pdf (2 MB)