SLOPE is adaptive to unknown sparsity and asymptotically minimax

From MaRDI portal
Publication: 292875

DOI: 10.1214/15-AOS1397
zbMath: 1338.62032
arXiv: 1503.08393
OpenAlex: W2963943067
MaRDI QID: Q292875

Weijie Su, Emmanuel J. Candès

Publication date: 9 June 2016

Published in: The Annals of Statistics

Full work available at URL: https://arxiv.org/abs/1503.08393
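For context, SLOPE (Sorted L-One Penalized Estimation) generalizes the lasso by applying a non-increasing weight sequence λ1 ≥ … ≥ λp ≥ 0 to the coordinates of the estimate sorted by magnitude: the penalty is Σ λi |β|(i), where |β|(1) ≥ … ≥ |β|(p). A minimal sketch of this penalty (the function name is illustrative, not from the paper):

```python
import numpy as np

def sorted_l1_penalty(beta, lam):
    """Sorted-l1 penalty: sum_i lam[i] * |beta|_(i),
    where |beta|_(1) >= ... >= |beta|_(p) and lam is non-increasing."""
    lam = np.asarray(lam, dtype=float)
    mags = np.sort(np.abs(beta))[::-1]  # coordinate magnitudes, descending
    return float(np.dot(lam, mags))

beta = np.array([3.0, -1.0, 2.0])
# With all weights equal, the penalty reduces to the ordinary lasso penalty:
print(sorted_l1_penalty(beta, [0.5, 0.5, 0.5]))  # 3.0, same as 0.5 * ||beta||_1
# With decreasing weights, larger coordinates are penalized more heavily:
print(sorted_l1_penalty(beta, [1.0, 0.5, 0.1]))  # 1.0*3 + 0.5*2 + 0.1*1 = 4.1
```

The decreasing weights are what allow SLOPE to adapt to unknown sparsity levels, the property established in this paper.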



Related Items

Fundamental barriers to high-dimensional regression with convex penalties
Safe Rules for the Identification of Zeros in the Solutions of the SLOPE Problem
Iterative algorithm for discrete structure recovery
SLOPE is adaptive to unknown sparsity and asymptotically minimax
Adaptive Huber regression on Markov-dependent data
Sparse index clones via the sorted ℓ1-Norm
Bayesian factor-adjusted sparse regression
The Spike-and-Slab LASSO
Adaptive Bayesian SLOPE: Model Selection With Incomplete Data
Group SLOPE – Adaptive Selection of Groups of Predictors
Predictor ranking and false discovery proportion control in high-dimensional regression
Characterizing the SLOPE trade-off: a variational perspective and the Donoho-Tanner limit
Fundamental limits of weak recovery with applications to phase retrieval
Optimal false discovery control of minimax estimators
Robust machine learning by median-of-means: theory and practice
On the asymptotic properties of SLOPE
On spike and slab empirical Bayes multiple testing
RANK: Large-Scale Inference With Graphical Nonlinear Knockoffs
Oracle inequalities for high-dimensional prediction
Improved bounds for square-root Lasso and square-root slope
Bayesian estimation of sparse signals with a continuous spike-and-slab prior
Slope meets Lasso: improved oracle bounds and optimality
Debiasing the Lasso: optimal sample size for Gaussian designs
Regularization and the small-ball method. I: Sparse recovery
Sparse inference of the drift of a high-dimensional Ornstein-Uhlenbeck process
Learning from MOM's principles: Le Cam's approach
Variable selection via adaptive false negative control in linear regression
Sorted concave penalized regression
On the exponentially weighted aggregate with the Laplace prior
Degrees of freedom in submodular regularization: a computational perspective of Stein's unbiased risk estimate
Sharp oracle inequalities for low-complexity priors
Simple expressions of the LASSO and SLOPE estimators in low-dimension
Estimation bounds and sharp oracle inequalities of regularized procedures with Lipschitz loss functions
Sharp Oracle Inequalities for Square Root Regularization
Regularization and the small-ball method II: complexity dependent error rates
Nonregular and minimax estimation of individualized thresholds in high dimension with binary responses
Iterative gradient descent for outlier detection
A Unifying Tutorial on Approximate Message Passing


Uses Software


Cites Work