Pages 1533-1545
Received 01 Apr 2011
Accepted author version posted online: 08 Oct 2012
Published online: 21 Dec 2012
 

Abstract

Reduced-rank regression is an effective method for predicting multiple response variables from the same set of predictor variables. It reduces the number of model parameters and exploits interrelations among the response variables, thereby improving predictive accuracy. We propose to select relevant variables for reduced-rank regression by using a sparsity-inducing penalty. We apply a group-lasso type penalty that treats each row of the matrix of regression coefficients as a group, and we show that this penalty satisfies certain desirable invariance properties. We develop two numerical algorithms to solve the penalized regression problem and establish the asymptotic consistency of the proposed method. In particular, our theoretical analysis considers the manifold structure of the reduced-rank regression coefficient matrix. In our simulation study and real data analysis, the new method is compared with several existing variable selection methods for multivariate regression and exhibits competitive performance in prediction and variable selection.
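To make the row-wise group-lasso idea concrete, here is a minimal sketch of one plausible alternating scheme for a criterion of this general form: a least-squares fit with a rank constraint on the coefficient matrix plus a group-lasso penalty on its rows. It is illustrative only and is not the authors' algorithm; the function name sparse_rrr, the factorization C = B Aᵀ with orthonormal A, the tuning parameter lam, and the fixed rank are all assumptions made for the example.

```python
import numpy as np


def sparse_rrr(X, Y, rank, lam, n_iter=100, tol=1e-6):
    """Row-sparse reduced-rank regression (illustrative sketch).

    Approximately minimizes  ||Y - X B A^T||_F^2 + lam * sum_i ||B[i, :]||_2
    over B (p x rank) and A (q x rank, orthonormal columns).  C = B A^T is
    the rank-constrained coefficient matrix; because A has orthonormal
    columns, penalizing the rows of B shrinks whole rows of C to zero.
    """
    p = X.shape[1]

    # Initialize from the ordinary least-squares fit and its leading
    # right singular directions (the classical reduced-rank solution).
    C_ols = np.linalg.lstsq(X, Y, rcond=None)[0]
    _, _, Vt = np.linalg.svd(X @ C_ols, full_matrices=False)
    A = Vt[:rank, :].T                      # q x rank, orthonormal columns
    B = C_ols @ A                           # p x rank
    col_norms2 = (X ** 2).sum(axis=0)       # ||x_j||^2 for coordinate steps

    obj_old = np.inf
    for _ in range(n_iter):
        # B-step: multivariate group lasso with working response Y A,
        # solved by blockwise (row-wise) coordinate descent.
        Z = Y @ A
        R = Z - X @ B                        # current residual
        for j in range(p):
            R += np.outer(X[:, j], B[j])     # partial residual without row j
            g = X[:, j] @ R                  # length-rank gradient direction
            g_norm = np.linalg.norm(g)
            if g_norm <= lam / 2.0:
                B[j] = 0.0                   # whole row set to zero
            else:
                B[j] = (1.0 - lam / (2.0 * g_norm)) * g / col_norms2[j]
            R -= np.outer(X[:, j], B[j])     # restore residual with new row j

        # A-step: orthogonal Procrustes problem, solved by a thin SVD.
        U, _, Vt = np.linalg.svd(Y.T @ (X @ B), full_matrices=False)
        A = U @ Vt

        # Monitor the penalized objective for convergence.
        obj = np.sum((Y - X @ B @ A.T) ** 2) + lam * np.linalg.norm(B, axis=1).sum()
        if abs(obj_old - obj) <= tol * (1.0 + abs(obj)):
            break
        obj_old = obj

    return B @ A.T                           # p x q coefficient matrix
```

With A fixed, the B-step reduces to a standard multivariate group lasso, handled above by row-wise soft-thresholding; with B fixed, the A-step is an orthogonal Procrustes problem solved by a thin SVD. Any scheme of this kind only sketches the structure of the penalized problem described in the abstract.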

Acknowledgments

Huang’s work was partially supported by grants from the National Science Foundation (NSF; DMS-0907170, DMS-1007618, DMS-1208952) and by Award Number KUS-CI-016-04, made by King Abdullah University of Science and Technology (KAUST). The authors thank Joseph Chang, Dean Foster, Zhihua Qiao, Lyle Ungar, and Lan Zhou for helpful discussions. They also thank two anonymous reviewers, an associate editor, and the co-editor Jun Liu for constructive comments.
