A Domain-Agnostic Approach for Characterization of Lifelong Learning Systems.
Megan M. Baker, Alexander New, Mario Aguilar-Simon, Ziad Al-Halah, Sébastien M. R. Arnold, Ese Ben-Iwhiwhu, Andrew P. Brna, Ethan Brooks, Ryan C. Brown, Zachary Daniels, Anurag Daram, Fabien Delattre, Ryan Dellana, Eric Eaton, Haotian Fu, Kristen Grauman, Jesse Hostetler, Shariq Iqbal, Cassandra Kent, Nicholas Ketz, Soheil Kolouri, George Konidaris, Dhireesha Kudithipudi, Erik Learned-Miller, Seungwon Lee, Michael L. Littman, Sandeep Madireddy, Jorge A. Mendez, Eric Q. Nguyen, Christine D. Piatko, Praveen K. Pilly, Aswin Raghavan, Abrar Rahman, Santhosh Kumar Ramakrishnan, Neale Ratzlaff, Andrea Soltoggio, Peter Stone, Indranil Sur, Zhipeng Tang, Saket Tiwari, Kyle Vedder, Felix Wang, Zifan Xu, Angel Yanguas-Gil, Harel Yedidsion, Shangqun Yu, and Gautam K. Vallabha.
Neural Networks, pp. 274–296, March 2023.
Available from https://arxiv.org/abs/2301.07799
Official version on publisher's website: https://www.sciencedirect.com/science/article/pii/S0893608023000072
Despite the advancement of machine learning techniques in recent years, state-of-the-art systems lack robustness to "real world" events, where the input distributions and tasks encountered by the deployed systems will not be limited to the original training context, and systems will instead need to adapt to novel distributions and tasks while deployed. This critical gap may be addressed through the development of "Lifelong Learning" systems that are capable of 1) Continuous Learning, 2) Transfer and Adaptation, and 3) Scalability. Unfortunately, efforts to improve these capabilities are typically treated as distinct areas of research that are assessed independently, without regard to the impact of each separate capability on other aspects of the system. We instead propose a holistic approach, using a suite of metrics and an evaluation framework to assess Lifelong Learning in a principled way that is agnostic to specific domains or system techniques. Through five case studies, we show that this suite of metrics can inform the development of varied and complex Lifelong Learning systems. We highlight how the proposed suite of metrics quantifies performance trade-offs present during Lifelong Learning system development - both the widely discussed Stability-Plasticity dilemma and the newly proposed relationship between Sample Efficient and Robust Learning. Further, we make recommendations for the formulation and use of metrics to guide the continuing development of Lifelong Learning systems and assess their progress in the future.
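For context on the kind of measurements the abstract describes, the sketch below computes three widely used lifelong-learning quantities (average final accuracy, backward transfer, and forward transfer) from a task-performance matrix. This is a minimal illustration of standard continual-learning metrics, not the specific metric definitions given in the article; the function name lifelong_metrics, the matrix layout R[i, j], and the baseline vector are illustrative assumptions.

# Illustrative sketch only: standard continual-learning metrics in the spirit of a
# lifelong-learning metric suite, not the formulations defined in the article.
# Assumes R[i, j] is performance on task j measured after training on task i,
# and baseline[j] is a reference score for task j (e.g., an untrained model).
import numpy as np

def lifelong_metrics(R: np.ndarray, baseline: np.ndarray) -> dict:
    T = R.shape[0]
    final = R[-1]                      # performance on every task after the full sequence
    avg_accuracy = final.mean()
    # Backward transfer: change on earlier tasks after later training;
    # negative values indicate forgetting (loss of stability).
    bwt = np.mean([R[-1, j] - R[j, j] for j in range(T - 1)])
    # Forward transfer: performance on a task before training on it, relative to the
    # baseline; positive values indicate that earlier learning helps on unseen tasks.
    fwt = np.mean([R[j - 1, j] - baseline[j] for j in range(1, T)])
    return {"avg_accuracy": avg_accuracy, "backward_transfer": bwt, "forward_transfer": fwt}

# Toy usage: 3 tasks, rows = training stage, columns = evaluated task.
R = np.array([[0.90, 0.20, 0.15],
              [0.85, 0.88, 0.25],
              [0.80, 0.84, 0.91]])
baseline = np.array([0.10, 0.10, 0.10])
print(lifelong_metrics(R, baseline))

In this toy run, backward transfer is negative (some forgetting of earlier tasks) while forward transfer is positive, which is the kind of stability-versus-plasticity trade-off the abstract refers to.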
@article{NN23-L2M, title="A Domain-Agnostic Approach for Characterization of Lifelong Learning Systems", author="Megan M. Baker and Alexander New and Mario Aguilar-Simon and Ziad Al-Halah and S\'ebastien M. R. Arnold and Ese Ben-Iwhiwhu and Andrew P. Brna and Ethan Brooks and Ryan C. Brown and Zachary Daniels and Anurag Daram and Fabien Delattre and Ryan Dellana and Eric Eaton and Haotian Fu and Kristen Grauman and Jesse Hostetler and Shariq Iqbal and Cassandra Kent and Nicholas Ketz and Soheil Kolouri and George Konidaris and Dhireesha Kudithipudi and Erik Learned-Miller and Seungwon Lee and Michael L. Littman and Sandeep Madireddy and Jorge A. Mendez and Eric Q. Nguyen and Christine D. Piatko and Praveen K. Pilly and Aswin Raghavan and Abrar Rahman and Santhosh Kumar Ramakrishnan and Neale Ratzlaff and Andrea Soltoggio and Peter Stone and Indranil Sur and Zhipeng Tang and Saket Tiwari and Kyle Vedder and Felix Wang and Zifan Xu and Angel Yanguas-Gil and Harel Yedidsion and Shangqun Yu and Gautam K. Vallabha", journal="Neural Networks", year="2023", month="March", pages="274--296", abstract={Despite the advancement of machine learning techniques in recent years, state-of-the-art systems lack robustness to "real world" events, where the input distributions and tasks encountered by the deployed systems will not be limited to the original training context, and systems will instead need to adapt to novel distributions and tasks while deployed. This critical gap may be addressed through the development of "Lifelong Learning" systems that are capable of 1) Continuous Learning, 2) Transfer and Adaptation, and 3) Scalability. Unfortunately, efforts to improve these capabilities are typically treated as distinct areas of research that are assessed independently, without regard to the impact of each separate capability on other aspects of the system. We instead propose a holistic approach, using a suite of metrics and an evaluation framework to assess Lifelong Learning in a principled way that is agnostic to specific domains or system techniques. Through five case studies, we show that this suite of metrics can inform the development of varied and complex Lifelong Learning systems. We highlight how the proposed suite of metrics quantifies performance trade-offs present during Lifelong Learning system development - both the widely discussed Stability-Plasticity dilemma and the newly proposed relationship between Sample Efficient and Robust Learning. Further, we make recommendations for the formulation and use of metrics to guide the continuing development of Lifelong Learning systems and assess their progress in the future.}, wwwnote={Available from <a href="https://arxiv.org/abs/2301.07799">https://arxiv.org/abs/2301.07799</a><br> <a href="https://www.sciencedirect.com/science/article/pii/S0893608023000072?dgcid=coauthor" target="_blank">Official version</a> on publisher's website}, }