STEIN UNBIASED RISK ESTIMATE AS A MODEL SELECTION ESTIMATOR
Nihad Nouri1*, Fatiha Mezoued2
1Phd student, Dept. Applied Statistics, National Higher School of Statistics and Applied Economy, Algeria, firstname.lastname@example.org
2Prof., Dept. Applied Statistics, National Higher School of Statistics and Applied Economy, Algeria, email@example.com
To recover a low-rank structure from a noisy matrix, many recent authors have used and studied the truncated singular value decomposition. According to these studies, the image can be better estimated by also shrinking the singular values. This paper is concerned with additive models of the form Y = M + E, where Y is an observed n×m matrix with m < n, M is an unknown n×m matrix of interest with low rank, and E is random noise. For a family of estimators of M obtained by applying shrinkage functions ϕλ(σi) to the singular value decomposition of the matrix Y, we are interested in the performance of the model proposed by Candès et al. (2012) for another thresholding function, the Minimax Concave Penalty (MCP), under the assumption that the data matrix Y follows a multivariate Student-t distribution, which belongs to the family of elliptical distributions extending the Gaussian case. In this distributional context, we apply Stein's unbiased risk estimate (SURE), as extended by S. Canu and D. Fourdrinier (2017), both to select the better thresholding function between the Minimax Concave Penalty (MCP) and soft-thresholding, and to find the optimal shrinkage parameter λ from the data Y. Numerical results show that the SURE risk estimate is accurate: the minima are reached at the same λ (λ∗ = 5218.4), the difference between the estimated (SURE) and the exact (mean square error, MSE) risks is small, and the risk of MCP is lower than that of soft-thresholding.
Keywords: Stein's unbiased risk estimate, mean square error, elliptical distribution, singular value decomposition, minimax concave penalty, soft-thresholding.
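As a rough illustration of the setup described above (not the authors' code), the two shrinkage rules compared in the abstract can be sketched in NumPy. The function names, the concavity parameter γ = 3 for MCP, and the use of an oracle MSE in place of the SURE criterion of Canu and Fourdrinier (2017) are all assumptions made for this sketch:

```python
import numpy as np

def soft_threshold(s, lam):
    # Soft-thresholding: shrink each singular value toward zero by lam.
    return np.maximum(s - lam, 0.0)

def mcp_threshold(s, lam, gamma=3.0):
    # Minimax Concave Penalty (MCP) shrinkage: small singular values are
    # shrunk (more gently than soft-thresholding), while values above
    # gamma * lam are left untouched. gamma = 3 is an assumed default.
    return np.where(s <= gamma * lam,
                    np.maximum(s - lam, 0.0) * gamma / (gamma - 1.0),
                    s)

def shrink_svd(Y, lam, phi):
    # Estimator of M: apply the shrinkage rule phi to the singular
    # values of Y and reassemble the matrix.
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    return U @ np.diag(phi(s, lam)) @ Vt

# Synthetic low-rank signal plus noise, standing in for Y = M + E.
rng = np.random.default_rng(0)
n, m, r = 60, 40, 3
M = 5.0 * rng.standard_normal((n, r)) @ rng.standard_normal((r, m))
Y = M + rng.standard_normal((n, m))

def best_lambda(phi, lams=np.linspace(0.0, 20.0, 101)):
    # Oracle MSE over a grid of lambda values; in the paper this role
    # is played by SURE, which does not require knowing M.
    risks = [np.sum((shrink_svd(Y, l, phi) - M) ** 2) for l in lams]
    i = int(np.argmin(risks))
    return lams[i], risks[i]

lam_soft, risk_soft = best_lambda(soft_threshold)
lam_mcp, risk_mcp = best_lambda(mcp_threshold)
```

Note that MCP is continuous at σ = γλ (both branches give σ there) and coincides with the identity beyond it, which is why large singular values, carrying the low-rank signal, are not biased downward as they are under soft-thresholding.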