Statistical learning and selective inference
DOI: 10.1073/pnas.1507583112 · zbMath: 1359.62228 · OpenAlex: W1885924565 · Wikidata: Q24289252 · Scholia: Q24289252 · MaRDI QID: Q2962284
Jonathan E. Taylor, Robert Tibshirani
Publication date: 16 February 2017
Published in: Proceedings of the National Academy of Sciences
Full work available at URL: https://doi.org/10.1073/pnas.1507583112
Mathematics Subject Classification:
- Factor analysis and principal components; correspondence analysis (62H25)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- General nonlinear regression (62J02)
- Learning and adaptive systems in artificial intelligence (68T05)
Related Items
- Strong selection consistency of Bayesian vector autoregressive models based on a pseudo-likelihood approach
- Probabilistic learning inference of boundary value problem with uncertainties based on Kullback-Leibler divergence under implicit constraints
- Demystifying the bias from selective inference: a revisit to Dawid's treatment selection problem
- Least-Square Approximation for a Distributed System
- Estimation and Inference of Heterogeneous Treatment Effects using Random Forests
- Variable selection – A review and recommendations for the practicing statistician
- Frequentist validity of Bayesian limits
- Post-selection inference of generalized linear models based on the lasso and the elastic net
- Dirichlet process mixture models for insurance loss data
- Covariate-driven factorization by thresholding for multiblock data
- Filtering the Rejection Set While Preserving False Discovery Rate Control
- Tuning parameter selection for penalized estimation via \(R^2\)
- Selective inference for clustering with unknown variance
- Statistical proof? The problem of irreproducibility
- Post hoc confidence bounds on false positives using reference families
- Neighborhood-based cross fitting approach to treatment effects with high-dimensional data
- A Complete Framework for Model-Free Difference-in-Differences Estimation
- Post-selection inference via algorithmic stability
- Polynomial-chaos-based conditional statistics for probabilistic learning with heterogeneous data applied to atomic collisions of helium on graphite substrate
- Asymptotically honest confidence regions for high dimensional parameters by the desparsified conservative Lasso
- Entropy-based closure for probabilistic learning on manifolds
- A note on Type S/M errors in hypothesis testing
- Probabilistic learning on manifolds constrained by nonlinear partial differential equations for small datasets
- Likelihood Ratio Test in Multivariate Linear Regression: from Low to High Dimension
- Capturing Spike Variability in Noisy Izhikevich Neurons Using Point Process Generalized Linear Models
- Exploratory subgroup analysis in clinical trials by model selection
- The principal problem with principal components regression
- Log-Contrast Regression with Functional Compositional Predictors: Linking Preterm Infant's Gut Microbiome Trajectories to Neurobehavioral Outcome
- Valid Inference Corrected for Outlier Removal
- In defense of the indefensible: a very naïve approach to high-dimensional inference
- Statistical Inference Enables Bad Science; Statistical Thinking Enables Good Science
Cites Work
- Inference in adaptive regression via the Kac-Rice formula
- Valid post-selection inference
- Selecting the number of principal components: estimation of the true rank of a noisy matrix
- High-dimensional variable selection
- A significance test for the lasso
- Statistical significance for genomewide studies
- For most large underdetermined systems of linear equations the minimal ℓ1-norm solution is also the sparsest solution