
Sparse models for machine learning

Lin, Jianyi
2023

Abstract

Arguably one of the most notable forms of the principle of parsimony was formulated by the philosopher and theologian William of Ockham in the 14th century; it later became widely known as Ockham’s Razor and can be phrased as: “Entities should not be multiplied without necessity.” This principle is one of the most fundamental ideas pervading many branches of knowledge, from philosophy to art and science, from ancient times to the modern age, and it is echoed in the maxim “Make everything as simple as possible, but not simpler,” famously attributed to Albert Einstein. Sparse modeling is a clear manifestation of this parsimony principle, and sparse models are widespread in statistics, physics, information sciences, neuroscience, computational mathematics, and beyond. In statistics, applications of sparse modeling span regression, classification, graphical model selection, sparse M-estimation, and sparse dimensionality reduction. Sparse modeling is particularly effective in statistical and machine learning settings where the primary goal is to discover predictive patterns in data that enhance our understanding and control of the underlying physical, biological, or other natural processes, rather than merely to build accurate black-box predictors. Common examples include selecting biomarkers in biological studies, locating brain activity that is predictive of brain states and processes in fMRI data, and identifying network bottlenecks that best explain end-to-end performance. Moreover, research on the efficient recovery of high-dimensional sparse signals from a relatively small number of observations, the main focus of compressed sensing (also called compressive sensing), has grown rapidly into an intensely studied area reaching well beyond classical signal processing. Sparse modeling is likewise directly related to various artificial vision tasks, such as image denoising, segmentation, restoration, and super-resolution; object and face detection and recognition in visual scenes; and action recognition and behavior analysis. Sparsity has also been applied to information compression, text classification, and recommendation systems. In this chapter, we give a brief introduction to the basic theory underlying sparse representation and compressive sensing, discuss effective methods for recovering sparse solutions to optimization problems, and present applications of sparse recovery to a machine learning problem known as sparse dictionary learning.
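To make the recovery problem the abstract refers to concrete, here is the standard formulation from the compressed sensing literature (generic notation, not necessarily the chapter's own): given a sensing matrix A with far fewer measurements than unknowns and observations y = Ax₀ of a sparse signal x₀, one seeks the sparsest consistent solution and, because the ℓ₀ problem is combinatorial, typically solves its convex ℓ₁ relaxation (basis pursuit):

```latex
% Sparse recovery and its convex relaxation (standard formulation):
% A \in \mathbb{R}^{m \times n}, m \ll n, measurements y = Ax_0.
\min_{x \in \mathbb{R}^n} \|x\|_0 \quad \text{s.t.} \quad Ax = y
\qquad \longrightarrow \qquad
\min_{x \in \mathbb{R}^n} \|x\|_1 \quad \text{s.t.} \quad Ax = y
```

Below is a minimal runnable sketch of greedy sparse recovery with Orthogonal Matching Pursuit, one representative method from the family the chapter surveys; it uses scikit-learn, and all dimensions and parameter values are illustrative assumptions, not settings taken from the chapter:

```python
# Illustrative sparse recovery via Orthogonal Matching Pursuit (OMP).
# Assumed setup: noiseless measurements y = A @ x_true with a random
# Gaussian sensing matrix; all values are arbitrary, for demonstration.
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

rng = np.random.default_rng(0)
n, m, k = 200, 60, 5          # ambient dimension, measurements, sparsity

# Random Gaussian sensing matrix with unit-norm columns, a common choice
# that satisfies recovery conditions with high probability.
A = rng.standard_normal((m, n))
A /= np.linalg.norm(A, axis=0)

# Ground-truth k-sparse signal supported on k random coordinates.
x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.standard_normal(k)

y = A @ x_true                # compressed (undersampled) measurements

# Greedy recovery: OMP selects one atom per iteration, up to k atoms.
omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k, fit_intercept=False)
omp.fit(A, y)
x_hat = omp.coef_

print("support recovered:", set(np.flatnonzero(x_hat)) == set(support))
print("relative error:",
      np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

Replacing the fixed sensing matrix A with a matrix learned from the data itself (as in, e.g., scikit-learn's DictionaryLearning) leads to the sparse dictionary learning problem with which the chapter concludes.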
Year: 2023
Language: English
Book: Engineering Mathematics and Artificial Intelligence: Foundations, Methods, and Applications
ISBN: 9781003283980
Publisher: CRC Press
Citation: Lin, J., “Sparse models for machine learning”, in Kunze, H., La Torre, D., Riccoboni, A., Ruiz Galán, M. (eds.), Engineering Mathematics and Artificial Intelligence: Foundations, Methods, and Applications, Mathematics and its Applications series, CRC Press, Boca Raton, 2023, pp. 107-146. DOI: 10.1201/9781003283980-5 [https://hdl.handle.net/10807/247134]