A power analysis for Model-X knockoffs with \(\ell_p\)-regularized statistics
DOI: 10.1214/23-AOS2274
arXiv: 2007.15346
OpenAlex: W4386035722
MaRDI QID: Q6136579
FDO: Q6136579
Emmanuel J. Candès, Rina Foygel Barber, Weijie J. Su, Małgorzata Bogdan, Asaf Weinstein
Publication date: 31 August 2023
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/2007.15346
MSC classifications:
- Parametric hypothesis testing (62F03)
- Asymptotic properties of parametric estimators (62F12)
- Asymptotic properties of parametric tests (62F05)
- Statistical ranking and selection procedures (62F07)
Cites Work
- Type S error for classical and Bayesian single and multiple comparison procedures
- The Adaptive Lasso and Its Oracle Properties
- Statistics for high-dimensional data. Methods, theory and applications.
- High-dimensional graphs and variable selection with the Lasso
- Title not available
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- A knockoff filter for high-dimensional selective inference
- Adaptive linear step-up procedures that control the false discovery rate
- False discoveries occur early on the Lasso path
- Adaptive false discovery rate control under independence and dependence
- Panning for Gold: ‘Model-X’ Knockoffs for High Dimensional Controlled Variable Selection
- Controlling the false discovery rate via knockoffs
- The adaptive and the thresholded Lasso for potentially misspecified models (and a lower bound for the Lasso)
- SLOPE-adaptive variable selection via convex optimization
- Rate minimaxity of the Lasso and Dantzig selector for the \(l_{q}\) loss in \(l_{r}\) balls
- Observed universality of phase transitions in high-dimensional geometry, with implications for modern data analysis and signal processing
- The LASSO Risk for Gaussian Matrices
- The Dynamics of Message Passing on Dense Graphs, with Applications to Compressed Sensing
- A necessary and sufficient condition for exact sparse recovery by \(\ell_1\) minimization
- Conclusions vs Decisions
- Overcoming the limitations of phase transition by higher order analysis of regularization techniques
- A high-dimensional power analysis of the conditional randomization test and knockoffs
- Which bridge estimator is the best for variable selection?
- On the sign recovery by least absolute shrinkage and selection operator, thresholded least absolute shrinkage and selection operator, and thresholded basis pursuit denoising
- A power analysis for Model-X knockoffs with \(\ell_p\)-regularized statistics
Cited In (3)
Recommendations
- The \(k\)th power expectile regression
- Likelihood-based inference for the power regression model
- A high-dimensional power analysis of the conditional randomization test and knockoffs
- A unified approach to power calculation and sample size determination for random regression models
- Improved kth power expectile regression with nonignorable dropouts
- The exact power function of an exact test of a regression model against multiple separate alternatives
- Explicit estimators of an unknown parameter in a power regression problem
- On improvement of statistical estimators in a power regression problem