Variable Selection With Second-Generation P-Values
From MaRDI portal
Publication: 5050808
DOI: 10.1080/00031305.2021.1946150 · OpenAlex: W3177496675 · MaRDI QID: Q5050808
Thomas G. Stewart, Yi Zuo, Jeffrey D. Blume
Publication date: 18 November 2022
Published in: The American Statistician
Full work available at URL: https://arxiv.org/abs/2012.07941
Cites Work
- Unnamed Item
- Nearly unbiased variable selection under minimax concave penalty
- Quantile universal threshold
- The Adaptive Lasso and Its Oracle Properties
- Regression modeling strategies. With applications to linear models, logistic regression, and survival analysis
- Some sharp performance bounds for least squares regression with \(L_1\) regularization
- High-dimensional variable selection
- SLOPE-adaptive variable selection via convex optimization
- To explain or to predict?
- Lasso-type recovery of sparse representations for high-dimensional data
- Relaxed Lasso
- Asymptotics for Lasso-type estimators
- The adaptive and the thresholded Lasso for potentially misspecified models (and a lower bound for the Lasso)
- Which bridge estimator is the best for variable selection?
- Best subset, forward stepwise or Lasso? Analysis and recommendations based on extensive comparisons
- Calibrating nonconvex penalized regression in ultra-high dimension
- High-dimensional graphs and variable selection with the Lasso
- Asymptotic Equivalence of Regularization Methods in Thresholded Parameter Space
- When does more regularization imply fewer degrees of freedom? Sufficient conditions and counterexamples
- Forward Regression for Ultra-High Dimensional Variable Screening
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Outcome‐adaptive lasso: Variable selection for causal inference
- Variable selection – A review and recommendations for the practicing statistician
- Regularization after retention in ultrahigh dimensional linear regression models
- Hard thresholding regression
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- Information-Theoretic Limits on Sparsity Recovery in the High-Dimensional and Noisy Setting
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell_1$-Constrained Quadratic Programming (Lasso)
- Effective degrees of freedom: a flawed metaphor
- Nonconcave Penalized Likelihood With NP-Dimensionality
- Regularized M-estimators with nonconvexity: Statistical and algorithmic theory for local optima
- Tuning Parameter Selection in High Dimensional Penalized Likelihood
- High Dimensional Thresholded Regression and Shrinkage Effect
- An Introduction to Second-Generation p-Values
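The cited work "An Introduction to Second-Generation p-Values" defines the quantity this publication builds on: the second-generation p-value measures the overlap between an interval estimate and a pre-specified interval null hypothesis, with a correction capping wide, uninformative intervals at 1/2. A minimal sketch of that overlap formula, assuming a one-dimensional interval estimate $[l, u]$ and interval null $[l_0, u_0]$ (the function name `sgpv` and its arguments are illustrative, not from this record):

```python
def sgpv(lo, hi, null_lo, null_hi):
    """Second-generation p-value for interval estimate [lo, hi]
    against the interval null [null_lo, null_hi].

    p = (|I ∩ I0| / |I|) * max(|I| / (2|I0|), 1)

    Returns 0 when the estimate lies entirely outside the null
    (evidence against the null), 1 when it lies entirely inside
    (evidence for the null), and 1/2 for very wide intervals
    that cover the whole null (inconclusive).
    """
    overlap = max(0.0, min(hi, null_hi) - max(lo, null_lo))
    width = hi - lo
    null_width = null_hi - null_lo
    # Correction factor: intervals more than twice as wide as the
    # null cannot yield a p-value above 1/2.
    return (overlap / width) * max(width / (2.0 * null_width), 1.0)
```

For example, an interval entirely outside the null, such as `sgpv(1.0, 2.0, -0.5, 0.5)`, gives 0, while one entirely inside the null, such as `sgpv(-0.2, 0.3, -0.5, 0.5)`, gives 1. In the variable-selection setting of this publication, covariates whose coefficient intervals yield a second-generation p-value of 0 are the ones selected.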