A general theory of concave regularization for high-dimensional sparse estimation problems

From MaRDI portal

Publication:5965310

DOI: 10.1214/12-STS399
zbMath: 1331.62353
arXiv: 1108.4988
MaRDI QID: Q5965310

Tong Zhang, Cun-Hui Zhang

Publication date: 3 March 2016

Published in: Statistical Science

Full work available at URL: https://arxiv.org/abs/1108.4988
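The record itself does not reproduce any of the paper's formulas. As background, here is a minimal sketch of the kind of folded concave penalized least-squares objective the title refers to, written with the minimax concave penalty (MCP) as a representative concave penalty; the notation \(y, X, \beta, \lambda, \gamma\) is illustrative and not quoted from the paper:

\[
\hat{\beta} \in \arg\min_{\beta \in \mathbb{R}^p}
  \frac{1}{2n}\,\lVert y - X\beta \rVert_2^2
  + \sum_{j=1}^{p} \rho_\lambda\bigl(|\beta_j|\bigr),
\qquad
\rho_\lambda(t) = \lambda \int_0^{t} \Bigl(1 - \frac{s}{\gamma\lambda}\Bigr)_{+}\, ds
\quad (\text{MCP},\ \gamma > 1).
\]

Here \(\rho_\lambda\) is concave on \([0,\infty)\) with \(\rho_\lambda'(0{+}) = \lambda\), the defining features of the concave (folded concave) penalty class studied in this line of work.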




Related Items (only showing first 100 items)

Robust Signal Recovery for High-Dimensional Linear Log-Contrast Models with Compositional Covariates
A review of distributed statistical inference
REMI: REGRESSION WITH MARGINAL INFORMATION AND ITS APPLICATION IN GENOME-WIDE ASSOCIATION STUDIES
Regularized projection score estimation of treatment effects in high-dimensional quantile regression
Bayesian Estimation of Gaussian Conditional Random Fields
Fitting sparse linear models under the sufficient and necessary condition for model identification
Global solutions to folded concave penalized nonconvex learning
Best subset selection via a modern optimization lens
\(\ell_0\)-regularized high-dimensional accelerated failure time model
Sparse recovery via nonconvex regularized \(M\)-estimators over \(\ell_q\)-balls
On the strong oracle property of concave penalized estimators with infinite penalty derivative at the origin
GSDAR: a fast Newton algorithm for \(\ell_0\) regularized generalized linear models with statistical guarantee
De-biasing the Lasso with degrees-of-freedom adjustment
Random subspace method for high-dimensional regression with the \texttt{R} package \texttt{regRSM}
High-Dimensional Learning Under Approximate Sparsity with Applications to Nonsmooth Estimation and Regularized Neural Networks
Hard Thresholding Regularised Logistic Regression: Theory and Algorithms
Nonlinear Variable Selection via Deep Neural Networks
Distributed testing and estimation under sparse high dimensional models
Bias versus non-convexity in compressed sensing
The Spike-and-Slab LASSO
Variable selection and parameter estimation with the Atan regularization method
Homogeneity detection for the high-dimensional generalized linear model
Principal components adjusted variable screening
The use of random-effect models for high-dimensional variable selection problems
Conditional sure independence screening by conditional marginal empirical likelihood
Balanced estimation for high-dimensional measurement error models
In defense of LASSO
Oracle inequalities for the lasso in the Cox model
Almost sure uniqueness of a global minimum without convexity
Relaxed sparse eigenvalue conditions for sparse estimation via non-convex regularized regression
A doubly sparse approach for group variable selection
Folded concave penalized sparse linear regression: sparsity, statistical performance, and algorithmic theory for local solutions
Quantile regression for additive coefficient models in high dimensions
On high-dimensional Poisson models with measurement error: hypothesis testing for nonlinear nonconvex optimization
Simultaneous feature selection and outlier detection with optimality guarantees
Sparse signal reconstruction via the approximations of \(\ell_0\) quasinorm
\(L_0\)-regularization for high-dimensional regression with corrupted data
Sparse and robust estimation with ridge minimax concave penalty
Adaptive bridge estimator for Cox model with a diverging number of parameters
Subspace learning by \(\ell^0\)-induced sparsity
A convex-Nonconvex strategy for grouped variable selection
Matrix completion with nonconvex regularization: spectral operators and scalable algorithms
Retire: robust expectile regression in high dimensions
Robust High-Dimensional Regression with Coefficient Thresholding and Its Application to Imaging Data Analysis
Nonconvex penalized reduced rank regression and its oracle properties in high dimensions
Model selection in high-dimensional quantile regression with seamless \(L_0\) penalty
Communication-efficient distributed estimation for high-dimensional large-scale linear regression
Sparse estimation via lower-order penalty optimization methods in high-dimensional linear regression
High-dimensional composite quantile regression: optimal statistical guarantees and fast algorithms
Goodness-of-Fit Tests for High Dimensional Linear Models
Calibrating nonconvex penalized regression in ultra-high dimension
Estimation and inference for precision matrices of nonstationary time series
On the finite-sample analysis of \(\Theta\)-estimators
An unbiased approach to compressed sensing
Estimation and variable selection with exponential weights
Time-varying Hazards Model for Incorporating Irregularly Measured, High-Dimensional Biomarkers
A two-stage regularization method for variable selection and forecasting in high-order interaction model
High-dimensional grouped folded concave penalized estimation via the LLA algorithm
On the finite-sample analysis of \(\Theta\)-estimators
Optimal computational and statistical rates of convergence for sparse nonconvex learning problems
A Tuning-free Robust and Efficient Approach to High-dimensional Regression
Going beyond oracle property: selection consistency and uniqueness of local solution of the generalized linear model
Robust low-rank multiple kernel learning with compound regularization
I-LAMM for sparse learning: simultaneous control of algorithmic complexity and statistical error
Truncated $L^1$ Regularized Linear Regression: Theory and Algorithm
Penalized least squares estimation with weakly dependent data
Tuning parameter selection for the adaptive LASSO in the autoregressive model
Confidence Intervals for Low Dimensional Parameters in High Dimensional Linear Models
Learning latent variable Gaussian graphical model for biomolecular network with low sample complexity
Unnamed Item
On a monotone scheme for nonconvex nonsmooth optimization with applications to fracture mechanics
Tractable ADMM schemes for computing KKT points and local minimizers for \(\ell_0\)-minimization problems
Sorted concave penalized regression
Strong oracle optimality of folded concave penalized estimation
Endogeneity in high dimensions
Bayesian Bootstrap Spike-and-Slab LASSO
A unified primal dual active set algorithm for nonconvex sparse recovery
Introduction to the special issue on sparsity and regularization methods
Estimating structured high-dimensional covariance and precision matrices: optimal rates and adaptive estimation
Variance prior forms for high-dimensional Bayesian variable selection
OR Forum—An Algorithmic Approach to Linear Regression
Iteratively reweighted \(\ell_1\)-penalized robust regression
Asymptotic normality and optimalities in estimation of large Gaussian graphical models
High-dimensional linear model selection motivated by multiple testing
Majorized proximal alternating imputation for regularized rank constrained matrix completion
Second-order Stein: SURE for SURE and other applications in high-dimensional inference
Dynamic variable selection with spike-and-slab process priors
A Simple Method for Estimating Interactions Between a Treatment and a Large Number of Covariates
Simultaneous Variable and Covariance Selection With the Multivariate Spike-and-Slab LASSO
Nonnegative estimation and variable selection under minimax concave penalty for sparse high-dimensional linear regression models
Smoothing Newton method for \(\ell^0\)-\(\ell^2\) regularized linear inverse problem
A theoretical understanding of self-paced learning
Fast Best Subset Selection: Coordinate Descent and Local Combinatorial Optimization Algorithms
Accelerated Stochastic Algorithms for Nonconvex Finite-Sum and Multiblock Optimization
Weighted thresholding homotopy method for sparsity constrained optimization
Nonbifurcating Phylogenetic Tree Inference via the Adaptive LASSO
Joint feature screening for ultra-high-dimensional sparse additive hazards model by the sparsity-restricted pseudo-score estimator
Sample average approximation with sparsity-inducing penalty for high-dimensional stochastic programming
Robust moderately clipped LASSO for simultaneous outlier detection and variable selection
High-dimensional linear regression with hard thresholding regularization: theory and algorithm


Uses Software


Cites Work


This page was built for publication: A general theory of concave regularization for high-dimensional sparse estimation problems