Parsimonious mixtures for the analysis of tensor-variate data

Bagnato, Luca
2023

Abstract

Real data are increasingly taking on complex structures, raising the need for more flexible and parsimonious statistical methodologies. Tensor-variate (or multi-way) data are a typical example. Unfortunately, real data often contain atypical observations that make the traditional normality assumption inadequate. Thus, in this paper, we first introduce two new tensor-variate distributions, both heavy-tailed generalizations of the tensor-variate normal distribution. We then use these distributions for model-based clustering via finite mixture models. To introduce parsimony, we apply the eigen-decomposition to the components' scale matrices, obtaining two families of parsimonious tensor-variate mixture models. As a by-product, we also introduce the parsimonious version of tensor-variate normal mixtures. For parameter estimation, we illustrate variants of the well-known EM algorithm. Since the number of parsimonious models depends on the order of the tensors, we implement strategies intended to shorten the initialization and fitting processes. These procedures are investigated via simulation studies. Finally, we fit our parsimonious models to two real datasets having a 4-way and a 5-way structure, respectively.
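
As a point of reference, here is a minimal sketch of the modelling ingredients the abstract mentions, assuming the standard eigen-decomposition parametrization of parsimonious Gaussian mixtures applied to each mode-specific scale matrix (the notation below is illustrative and not necessarily the paper's):

\[
  f(\mathcal{X}) \;=\; \sum_{g=1}^{G} \pi_g\, f_g(\mathcal{X}; \boldsymbol{\Theta}_g),
  \qquad
  \boldsymbol{\Sigma}^{(d)}_{g} \;=\; \lambda^{(d)}_{g}\,
  \boldsymbol{\Gamma}^{(d)}_{g}\, \boldsymbol{\Delta}^{(d)}_{g}\,
  \boldsymbol{\Gamma}^{(d)\top}_{g},
\]

where $f$ is the finite mixture density of a tensor observation $\mathcal{X}$ with mixing proportions $\pi_g$, and $\boldsymbol{\Sigma}^{(d)}_{g}$ is the scale matrix of component $g$ along mode $d$, decomposed into a volume parameter $\lambda^{(d)}_{g}$, an orthogonal orientation matrix $\boldsymbol{\Gamma}^{(d)}_{g}$, and a diagonal shape matrix $\boldsymbol{\Delta}^{(d)}_{g}$ with unit determinant. Constraining each factor to be equal or variable across the $G$ components is what generates the parsimonious families, and heavy-tailed components are obtained by replacing $f_g$ with the new tensor-variate distributions.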
Publication year: 2023
Language: English
Tomarchio, S. D., Punzo, A., Bagnato, L. (2023). Parsimonious mixtures for the analysis of tensor-variate data. Statistics and Computing, 33(6): 1-27. doi:10.1007/s11222-023-10291-7. https://hdl.handle.net/10807/252234
Files in this record:
There are no files associated with this record.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this record: https://hdl.handle.net/10807/252234