Ex-Model: Continual Learning from a Stream of Trained Models

Carta A.; Cossu A.; Lomonaco V.; Bacciu D.
2022

Abstract

Learning continually from non-stationary data streams is a challenging research topic that has grown in popularity over the last few years. Being able to learn, adapt, and generalize continually in an efficient, effective, and scalable way is fundamental for the sustainable development of Artificial Intelligence systems. However, an agent-centric view of continual learning requires learning directly from raw data, which limits the interaction between independent agents, the efficiency, and the privacy of current approaches. Instead, we argue that continual learning systems should exploit the availability of compressed information in the form of trained models. In this paper, we introduce and formalize a new paradigm named "Ex-Model Continual Learning" (ExML), in which an agent learns from a sequence of previously trained models instead of raw data. We further contribute three ex-model continual learning algorithms and an empirical setting comprising three datasets (MNIST, CIFAR-10, and CORe50) and eight scenarios, on which the proposed algorithms are extensively tested. Finally, we highlight the peculiarities of the ex-model paradigm and point out interesting future research directions.
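The abstract does not spell out how an agent consumes a trained model, so the following is only a minimal sketch of the ex-model protocol under assumed details: a PyTorch-style student network is updated by distilling each incoming expert on auxiliary (surrogate or synthetic) inputs, never touching the experts' original training data. All names here (ExMLAgent, distill_from_expert, aux_loader) are illustrative, not the paper's API, and the double-distillation loss is just one plausible instantiation.

import copy
import torch
import torch.nn.functional as F


class ExMLAgent:
    """Illustrative ex-model continual learner: consumes trained models, not raw data."""

    def __init__(self, student: torch.nn.Module, lr: float = 1e-3):
        self.student = student
        self.opt = torch.optim.Adam(self.student.parameters(), lr=lr)

    def distill_from_expert(self, expert: torch.nn.Module, aux_loader, epochs: int = 1):
        """Transfer knowledge from one previously trained expert into the student.

        `aux_loader` yields unlabeled surrogate inputs (e.g. synthetic or public
        data); the expert's original training data is never accessed.
        """
        expert = copy.deepcopy(expert).eval()
        # Frozen copy of the current student, used to preserve previously
        # accumulated knowledge (in practice this term would be skipped or
        # down-weighted for the very first expert in the stream).
        old_student = copy.deepcopy(self.student).eval()
        self.student.train()
        for _ in range(epochs):
            for x in aux_loader:
                with torch.no_grad():
                    t_new = expert(x)       # knowledge of the new expert
                    t_old = old_student(x)  # knowledge already accumulated
                s = self.student(x)
                log_s = F.log_softmax(s, dim=1)
                # Match both teachers' soft predictions (a simple KD objective).
                loss = (F.kl_div(log_s, F.softmax(t_new, dim=1), reduction="batchmean")
                        + F.kl_div(log_s, F.softmax(t_old, dim=1), reduction="batchmean"))
                self.opt.zero_grad()
                loss.backward()
                self.opt.step()


# The ex-model stream is a sequence of trained models, one per experience:
# agent = ExMLAgent(student=my_network)
# for expert_model in stream_of_trained_models:
#     agent.distill_from_expert(expert_model, aux_loader)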
Year: 2022
ISBN: 978-1-6654-8739-9
Keywords: Learning systems, Data privacy, Computer vision, Conferences, Computational modeling, Data models, Pattern recognition
Citation: Carta, A.; Cossu, A.; Lomonaco, V.; Bacciu, D. (2022). Ex-Model: Continual Learning from a Stream of Trained Models. In 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops (CVPRW) (pp. 3789-3798). ISBN: 978-1-6654-8739-9. DOI: 10.1109/CVPRW56347.2022.00424.
Files in this record:
File: Ex-Model_Continual_Learning_from_a_Stream_of_Trained_Models.pdf
Access: Archive administrators only
Type: Publisher's version
License: All rights reserved
Size: 1.42 MB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11385/253578
Citations
  • Scopus: 8
  • Web of Science: 5
  • OpenAlex: not available