Selective inference with a randomized response
Publication: 1750283
DOI: 10.1214/17-AOS1564
zbMath: 1392.62144
arXiv: 1507.06739
OpenAlex: W2963901036
MaRDI QID: Q1750283
Xiaoying Tian, Jonathan E. Taylor
Publication date: 18 May 2018
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1507.06739
Mathematics Subject Classification:
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Linear regression; mixed models (62J05)
- Nonparametric inference (62G99)
Related Items
- A unified theory of confidence regions and testing for high-dimensional estimating equations
- Post-model-selection inference in linear regression models: an integrated review
- Scalable methods for Bayesian selective inference
- Optimal finite sample post-selection confidence distributions in generalized linear models
- Post-selection inference for changepoint detection algorithms with application to copy number variation data
- Integrative Bayesian Models Using Post-Selective Inference: A Case Study in Radiogenomics
- Selective inference after feature selection via multiscale bootstrap
- More Powerful Selective Inference for the Graph Fused Lasso
- Prediction error after model search
- Empirical Bayes and selective inference
- Post-selection inference via algorithmic stability
- Carving model-free inference
- Approximate Selective Inference via Maximum Likelihood
- Exact post-selection inference for the generalized Lasso path
- Approximate \(\ell_0\)-penalized estimation of piecewise-constant signals on graphs
- Selective inference with a randomized response
- Distribution-Free Predictive Inference For Regression
- Bootstrapping and sample splitting for high-dimensional, assumption-lean inference
- Inference after estimation of breaks
- Multicarving for high-dimensional post-selection inference
- Selective inference for latent block models
- Integrative methods for post-selection inference under convex constraints
- Valid Inference Corrected for Outlier Removal
- Some perspectives on inference in high dimensions
- Conditional selective inference for robust regression and outlier detection using piecewise-linear homotopy continuation
- Testing goodness-of-fit and conditional independence with approximate co-sufficient sampling
Cites Work
- On asymptotically optimal confidence regions and tests for high-dimensional models
- Exact post-selection inference, with application to the Lasso
- Exact and asymptotically robust permutation tests
- Statistical significance in high-dimensional linear models
- On the rate of convergence in the multivariate CLT
- High-dimensional variable selection
- Uniform asymptotic inference and the bootstrap after model selection
- Selective inference with a randomized response
- A significance test for the lasso
- Confidence Intervals and Hypothesis Testing for High-Dimensional Regression
- Preserving Statistical Validity in Adaptive Data Analysis
- p-Values for High-Dimensional Regression
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- Nonequivariant Simultaneous Confidence Intervals Less Likely to Contain Zero
- Scaled sparse linear regression
- A note on data-splitting for the evaluation of significance levels
- Selective inference with unknown variance via the square-root lasso
- Stability Selection
- Regression Shrinkage and Selection via The Lasso: A Retrospective
- A Note on Quantiles in Large Samples
- Asymptotics of Selective Inference
- Confidence Intervals for Low Dimensional Parameters in High Dimensional Linear Models