Strong oracle optimality of folded concave penalized estimation

DOI: 10.1214/13-AOS1198 · zbMath: 1305.62252 · arXiv: 1210.5992 · Wikidata: Q43107615 · Scholia: Q43107615 · MaRDI QID: Q2510819

Jianqing Fan, Lingzhou Xue, Hui Zou

Publication date: 4 August 2014

Published in: The Annals of Statistics

Full work available at URL: https://arxiv.org/abs/1210.5992



Related Items

A convex-Nonconvex strategy for grouped variable selection
High-Dimensional Censored Regression via the Penalized Tobit Likelihood
The nonparametric Box-Cox model for high-dimensional regression analysis
Retire: robust expectile regression in high dimensions
Statistical Learning for Individualized Asset Allocation
Multi-Task Learning with High-Dimensional Noisy Images
Robust High-Dimensional Regression with Coefficient Thresholding and Its Application to Imaging Data Analysis
Projection Test for Mean Vector in High Dimensions
Smoothing accelerated proximal gradient method with fast convergence rate for nonsmooth convex optimization beyond differentiability
Cross-Fitted Residual Regression for High-Dimensional Heteroscedasticity Pursuit
A penalized least product relative error loss function based on wavelet decomposition for non-parametric multiplicative additive models
Regularized Linear Programming Discriminant Rule with Folded Concave Penalty for Ultrahigh-Dimensional Data
Understanding Implicit Regularization in Over-Parameterized Single Index Model
Asset splitting algorithm for ultrahigh dimensional portfolio selection and its theoretical property
A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression
Achieving the oracle property of OEM with nonconvex penalties
Robust estimation and shrinkage in ultrahigh dimensional expectile regression with heavy tails and variance heterogeneity
Global solutions to folded concave penalized nonconvex learning
A scalable surrogate \(L_0\) sparse regression method for generalized linear models with applications to large scale data
On the strong oracle property of concave penalized estimators with infinite penalty derivative at the origin
Distributed optimization and statistical learning for large-scale penalized expectile regression
Estimating finite mixtures of ordinal graphical models
Oracle inequalities, variable selection and uniform inference in high-dimensional correlated random effects panel data models
High-Dimensional Learning Under Approximate Sparsity with Applications to Nonsmooth Estimation and Regularized Neural Networks
A proximal dual semismooth Newton method for zero-norm penalized quantile regression estimator
Are discoveries spurious? Distributions of maximum spurious correlations and their applications
Degrees of freedom for piecewise Lipschitz estimators
Bias versus non-convexity in compressed sensing
Constructing initial estimators in one-step estimation procedures of nonlinear regression
An interior stochastic gradient method for a class of non-Lipschitz optimization problems
Variable selection and parameter estimation with the Atan regularization method
A unifying framework of high-dimensional sparse estimation with difference-of-convex (DC) regularizations
Asymptotic properties of one-step M-estimators
Rates of convergence of the adaptive elastic net and the post-selection procedure in ultra-high dimensional sparse models
Balanced estimation for high-dimensional measurement error models
Robust low transformed multi-rank tensor methods for image alignment
Variable selection and structure identification for varying coefficient Cox models
Estimation and variable selection for partial functional linear regression
Almost sure uniqueness of a global minimum without convexity
Nonconcave penalized estimation in sparse vector autoregression model
Penalized estimation in finite mixture of ultra-high dimensional regression models
Folded concave penalized sparse linear regression: sparsity, statistical performance, and algorithmic theory for local solutions
Alternating direction method of multipliers for nonconvex fused regression problems
Difference-of-Convex Learning: Directional Stationarity, Optimality, and Sparsity
Quantile regression for additive coefficient models in high dimensions
Zero-norm regularized problems: equivalent surrogates, proximal MM method and statistical error bound
Smoothing neural network for \(L_0\) regularized optimization problem with general convex constraints
Simultaneous feature selection and outlier detection with optimality guarantees
Functional Group Bridge for Simultaneous Regression and Support Estimation
Asymptotic normality of one-step \(M\)-estimators based on non-identically distributed observations
Estimation of banded time-varying precision matrix based on SCAD and group Lasso
Neural network for a class of sparse optimization with \(L_0\)-regularization
Model selection and estimation in high dimensional regression models with group SCAD
Time-varying forecast combination for high-dimensional data
Sparse quantile regression
Identification of microbial features in multivariate regression under false discovery rate control
Impulse noise removal by using a nonconvex TGV regularizer and nonconvex fidelity
Comparing solution paths of sparse quadratic minimization with a Stieltjes matrix
\(\ell_1\)-regularization of high-dimensional time-series models with non-Gaussian and heteroskedastic errors
Model selection in high-dimensional quantile regression with seamless \(L_0\) penalty
Linear-step solvability of some folded concave and singly-parametric sparse optimization problems
Asymptotically honest confidence regions for high dimensional parameters by the desparsified conservative Lasso
Sparse estimation: an MMSE approach
An unbiased approach to compressed sensing
High-dimensional grouped folded concave penalized estimation via the LLA algorithm
Stochastic correlation coefficient ensembles for variable selection
Hard thresholding regression
Optimal computational and statistical rates of convergence for sparse nonconvex learning problems
Covariate assisted screening and estimation
Targeted Random Projection for Prediction From High-Dimensional Features
Mixed-Effect Time-Varying Network Model and Application in Brain Connectivity Analysis
Variable selection via generalized SELO-penalized Cox regression models
Going beyond oracle property: selection consistency and uniqueness of local solution of the generalized linear model
On the pervasiveness of difference-convexity in optimization and statistics
Portal nodes screening for large scale social networks
A cubic spline penalty for sparse approximation under tight frame balanced model
An efficient non-convex total variation approach for image deblurring and denoising
Pathwise coordinate optimization for sparse learning: algorithm and theory
I-LAMM for sparse learning: simultaneous control of algorithmic complexity and statistical error
Network tail risk estimation in the European banking system
Robust variable selection for finite mixture regression models
Diagonally Dominant Principal Component Analysis
Learning latent variable Gaussian graphical model for biomolecular network with low sample complexity
Elastic net penalized quantile regression model
High-dimensional inference: confidence intervals, \(p\)-values and R-software \texttt{hdi}
Strong oracle optimality of folded concave penalized estimation
Minimum average variance estimation with group Lasso for the multivariate response central mean subspace
Broken adaptive ridge regression and its asymptotic properties
A unified primal dual active set algorithm for nonconvex sparse recovery
Discussion of ``Estimating structured high-dimensional covariance and precision matrices: optimal rates and adaptive estimation''
Rejoinder of ``Estimating structured high-dimensional covariance and precision matrices: optimal rates and adaptive estimation''
QUADRO: a supervised dimension reduction method via Rayleigh quotient optimization
Computation of second-order directional stationary points for group sparse optimization
On integer and MPCC representability of affine sparsity
High-dimensional linear model selection motivated by multiple testing
The de-biased group Lasso estimation for varying coefficient models
Analysis of generalized Bregman surrogate algorithms for nonsmooth nonconvex statistical learning
Adaptively weighted group Lasso for semiparametric quantile regression models
Group penalized quantile regression
Nonbifurcating Phylogenetic Tree Inference via the Adaptive LASSO
Statistical inference for normal mixtures with unknown number of components
Sample average approximation with sparsity-inducing penalty for high-dimensional stochastic programming
Optimality condition and complexity analysis for linearly-constrained optimization without differentiability on the boundary
Nonlocal robust tensor recovery with nonconvex regularization
A convex relaxation framework consisting of a primal-dual alternative algorithm for solving \(\ell_0\) sparsity-induced optimization problems with application to signal recovery based image restoration
Nonconvex-TV Based Image Restoration with Impulse Noise Removal
Correction: Strong oracle optimality of folded concave penalized estimation
Robust Tensor Completion: Equivalent Surrogates, Error Bounds, and Algorithms


Uses Software


Cites Work