Rosasco, A.; Carta, A.; Cossu, A.; Lomonaco, Vincenzo; Bacciu, D.. (2022). Distilled Replay: Overcoming Forgetting Through Synthetic Samples. In Continual Semi-Supervised Learning First International Workshop, CSSL 2021, Virtual Event, August 19–20, 2021 (pp. 104- 117). Springer. Isbn: 978-3-031-17586-2. Doi: 10.1007/978-3-031-17587-9_8.

Distilled Replay: Overcoming Forgetting Through Synthetic Samples

Rosasco A.; Carta A.; Cossu A.; Lomonaco V.; Bacciu D.
2022

Abstract

Replay strategies are Continual Learning techniques that mitigate catastrophic forgetting by keeping a buffer of patterns from previous experiences and interleaving them with new data during training. The number of patterns stored in the buffer is a critical parameter that largely determines both the final performance and the memory footprint of the approach. This work introduces Distilled Replay, a novel replay strategy for Continual Learning that mitigates forgetting with a very small buffer (one pattern per class) of highly informative samples. Distilled Replay builds the buffer through a distillation process that compresses a large dataset into a tiny set of informative examples. We show the effectiveness of Distilled Replay against popular replay-based strategies on four Continual Learning benchmarks.
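To make the core idea concrete, here is a minimal illustrative sketch (not the paper's implementation) of a replay buffer that holds exactly one pattern per class and interleaves its contents with each incoming batch. The names `ReplayBuffer` and `interleave` are hypothetical; in Distilled Replay the stored patterns would be synthetic samples produced by a dataset-distillation step, whereas here a plain sample stands in for them.

```python
class ReplayBuffer:
    """Keeps exactly one (pattern, label) pair per class.

    Illustrative sketch only: in Distilled Replay the stored pattern
    would be a distilled synthetic sample, not a raw training example.
    """

    def __init__(self):
        self.patterns = {}  # label -> single buffered pattern

    def add(self, pattern, label):
        # Storing a new pattern for a class replaces the old one,
        # so the buffer size stays at one pattern per class.
        self.patterns[label] = pattern

    def sample(self):
        # Return every buffered (pattern, label) pair, one per class.
        return [(p, y) for y, p in self.patterns.items()]


def interleave(batch, buffer):
    """Concatenate the current batch with the replay buffer contents,
    so previously seen classes are rehearsed while training on new data."""
    return list(batch) + buffer.sample()
```

A training loop would call `interleave` on every mini-batch of the current experience, so the memory cost stays fixed at one pattern per class regardless of how many experiences have been seen.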
ISBN: 978-3-031-17586-2
Keywords: Continual learning, Deep learning, Distillation
Files in this product:
978-3-031-17587-9_8.pdf (Adobe PDF, 807.7 kB)
  Type: publisher's version
  License: all rights reserved
  Access: archive managers only

Documents in IRIS are protected by copyright, and all rights are reserved unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11385/253583
Citations
  • Scopus: 27
  • Web of Science (ISI): 17
  • OpenAlex: not available