$L_1$ Regularized Least Squares for Support Recovery of High Dimensional Single Index Models with Gaussian Designs (arXiv:1511.08102v1 [math.ST])
By Matey Neykov, Jun S. Liu, Tianxi Cai, via Statistics authors/titles recent submissions
It is known that for a certain class of single index models (SIMs) $Y = f(\boldsymbol{X}^{\intercal}\boldsymbol{\beta}, \varepsilon)$, support recovery is impossible when $\boldsymbol{X} \sim N(0, \mathbb{I}_p)$ and the rescaled sample size $\frac{n}{s \log(p-s)}$ is below a critical threshold. Recently, optimal algorithms based on Sliced Inverse Regression (SIR) were suggested. These algorithms work provably under the assumption that the design matrix $\boldsymbol{X}$ has i.i.d. Gaussian entries. In the present paper we analyze algorithms based on covariance screening and least squares with $L_1$ penalization (i.e., the LASSO) and demonstrate that they too can enjoy an optimal (up to a scalar) rescaled sample size in terms of support recovery, albeit under slightly different assumptions on $f$ and $\varepsilon$ than the SIR-based algorithms. Furthermore, we show, more generally, that the LASSO succeeds in recovering the signed support of $\boldsymbol{\beta}$ if $\boldsymbol{X} \sim N(0, \boldsymbol{\Sigma})$ and the covariance $\boldsymbol{\Sigma}$ satisfies the irrepresentable condition. Our work extends existing results on the support recovery of the LASSO for the linear model to a certain class of SIMs.
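To make the LASSO-based recovery concrete, here is a minimal simulation sketch, not the authors' code. The intuition (due to Brillinger, via Stein's identity) is that for a Gaussian design the best linear predictor of $Y$ is proportional to $\boldsymbol{\beta}$, so an ordinary LASSO fit can recover the signed support even though the link is nonlinear. The monotone link $f(u, \varepsilon) = 2u + \sin(u) + \varepsilon$, the constant 10 in the rescaled sample size, and the penalty level below are all illustrative assumptions, not values from the paper.

```python
# Minimal sketch (not the authors' code): LASSO signed support recovery for a
# single index model Y = f(X'beta, eps) with a standard Gaussian design.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
p, s = 500, 5                          # ambient dimension, sparsity (assumed)
n = int(10 * s * np.log(p - s))        # rescaled sample size n/(s log(p-s)) = 10

beta = np.zeros(p)
support = np.sort(rng.choice(p, size=s, replace=False))
beta[support] = rng.choice([-1.0, 1.0], size=s)   # unit-magnitude signed signal

X = rng.standard_normal((n, p))        # rows of X are i.i.d. N(0, I_p)
u = X @ beta
Y = 2 * u + np.sin(u) + rng.standard_normal(n)    # a monotone SIM (assumed link)

# Universal-style penalty level ~ sqrt(log(p)/n); the constant is not tuned.
alpha = np.std(Y) * np.sqrt(2 * np.log(p) / n)
coef = Lasso(alpha=alpha).fit(X, Y).coef_

print("true signed support:", [(j, np.sign(beta[j])) for j in support])
print("estimated signed support:",
      [(j, np.sign(coef[j])) for j in np.flatnonzero(coef)])
```

In this regime the nonzero LASSO coefficients typically coincide in location and sign with the true support; shrinking the sample-size constant toward the critical threshold makes recovery fail, matching the phase-transition phenomenon described above.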