Pages 11–22
Received 01 Sep 2016
Accepted author version posted online: 21 May 2018
Published online: 20 Aug 2018
 

ABSTRACT

We propose a penalized likelihood method to fit the linear discriminant analysis model when the predictor is matrix valued. We simultaneously estimate the means and the precision matrix, which we assume has a Kronecker product decomposition. Our penalties encourage pairs of response category mean matrix estimators to have equal entries and also encourage zeros in the precision matrix estimator. To compute our estimators, we use a blockwise coordinate descent algorithm. To update the optimization variables corresponding to response category mean matrices, we use an alternating minimization algorithm that takes advantage of the Kronecker structure of the precision matrix. We show that our method can outperform relevant competitors in classification, even when our modeling assumptions are violated. We analyze three real datasets to demonstrate our method’s applicability. Supplementary materials, including an R package implementing our method, are available online.
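To make the classification rule concrete, the following is a minimal base-R sketch, not the authors' penalized estimator or the MatrixLDA package: it evaluates the LDA discriminant score for a matrix-valued observation when the common precision matrix is the Kronecker product of a row precision Δ and a column precision Φ. The class means, precisions, and prior probabilities are assumed to be given (in the paper they would be the penalized estimates).

```r
## Conceptual sketch only: classify a matrix-valued observation X under an
## LDA model whose common precision matrix is Phi %x% Delta (Kronecker
## product of the column precision Phi and the row precision Delta).
## M_list, Delta, Phi, and prior are assumed known here.

matrix_lda_classify <- function(X, M_list, Delta, Phi, prior) {
  # Discriminant score for class j:
  #   log(prior_j) - 0.5 * tr( Phi %*% t(X - M_j) %*% Delta %*% (X - M_j) )
  scores <- vapply(seq_along(M_list), function(j) {
    R <- X - M_list[[j]]
    log(prior[j]) - 0.5 * sum(diag(Phi %*% t(R) %*% Delta %*% R))
  }, numeric(1))
  which.max(scores)  # index of the predicted class
}

## Toy example: 4 x 3 matrix predictors, two classes, identity precisions
set.seed(1)
nr <- 4; nc <- 3
M_list <- list(matrix(0, nr, nc), matrix(0.5, nr, nc))  # class mean matrices
Delta  <- diag(nr)                                      # row precision
Phi    <- diag(nc)                                      # column precision
X_new  <- matrix(rnorm(nr * nc), nr, nc) + M_list[[2]]
matrix_lda_classify(X_new, M_list, Delta, Phi, prior = c(0.5, 0.5))
```

In the proposed method, the mean matrices and the two precision factors would instead be estimated jointly by penalized likelihood, with penalties that fuse entries across class means and induce zeros in the precision factors.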

Acknowledgments

The authors thank the associate editor and referees for helpful comments.

Supplementary Materials

  • Appendix: Includes simulations comparing our method to the methods proposed by Zhong and Suslick (2015, “Matrix Discriminant Analysis With Application to Colorimetric Sensor Array Data,” Technometrics, 57, 524–534) and vector-valued sparse linear discriminant methods; simulations illustrating the efficiency gained by joint estimation of the μ*_j’s, Δ*, and Φ* using (3); and simulations investigating the sensitivity of (3) to the choice of weights.

  • Code: Includes R scripts to create the real datasets we analyze in Section 5, and to reproduce the simulation results.

  • MatrixLDA: An R package implementing our method, along with auxiliary functions for prediction and tuning parameter selection.

Additional information

Funding

This research was supported in part by the Doctoral Dissertation Fellowship from the University of Minnesota and by National Science Foundation grant DMS-1452068.
