Pages that link to "Item:Q1952206"
From MaRDI portal
The following pages link to The adaptive and the thresholded Lasso for potentially misspecified models (and a lower bound for the Lasso) (Q1952206):
Displaying 37 items.
- Regularized estimation in sparse high-dimensional time series models (Q127754) (← links)
- Orthogonal one step greedy procedure for heteroscedastic linear models (Q254223) (← links)
- Best subset selection via a modern optimization lens (Q282479) (← links)
- Thresholding least-squares inference in high-dimensional regression models (Q309566) (← links)
- \(\ell_{0}\)-penalized maximum likelihood for sparse directed acyclic graphs (Q355087) (← links)
- Statistical significance in high-dimensional linear models (Q373525) (← links)
- CAM: causal additive models, high-dimensional order search and penalized regression (Q482906) (← links)
- Regularized rank-based estimation of high-dimensional nonparanormal graphical models (Q741796) (← links)
- Endogenous treatment effect estimation using high-dimensional instruments and double selection (Q826717) (← links)
- Efficient estimation of approximate factor models via penalized maximum likelihood (Q898581) (← links)
- Monotone splines Lasso (Q1623607) (← links)
- High-dimensional simultaneous inference with the bootstrap (Q1694480) (← links)
- High-dimensional inference: confidence intervals, \(p\)-values and R-software \texttt{hdi} (Q1790302) (← links)
- High-dimensional variable selection via low-dimensional adaptive learning (Q2044323) (← links)
- Ridge regression revisited: debiasing, thresholding and bootstrap (Q2148980) (← links)
- Rejoinder on: ``Hierarchical inference for genome-wide association studies: a view on methodology with software'' (Q2184393) (← links)
- A look at robustness and stability of \(\ell_1\)-versus \(\ell_0\)-regularization: discussion of papers by Bertsimas et al. and Hastie et al. (Q2225318) (← links)
- High-dimensional variable screening and bias in subsequent inference, with an empirical comparison (Q2259726) (← links)
- Lasso and probabilistic inequalities for multivariate point processes (Q2345116) (← links)
- Preconditioning the Lasso for sign consistency (Q2346526) (← links)
- D-trace estimation of a precision matrix using adaptive lasso penalties (Q2418368) (← links)
- Calibrating nonconvex penalized regression in ultra-high dimension (Q2438760) (← links)
- An \(\ell_1\)-oracle inequality for the Lasso in multivariate finite mixture of multivariate Gaussian regression models (Q2786498) (← links)
- Estimation for High-Dimensional Linear Mixed-Effects Models Using \(\ell_1\)-Penalization (Q2911662) (← links)
- (Q4998948) (← links)
- Variable Selection With Second-Generation \(P\)-Values (Q5050808) (← links)
- Model Selection With Lasso-Zero: Adding Straw to the Haystack to Better Find Needles (Q5066436) (← links)
- Robust recovery of signals with partially known support information using weighted BPDN (Q5132234) (← links)
- Quasi-likelihood and/or robust estimation in high dimensions (Q5965304) (← links)
- Discussion of: ``Grouping strategies and thresholding for high dimension linear models'' (Q5965590) (← links)
- A Critical Review of LASSO and Its Derivatives for Variable Selection Under Dependence Among Covariates (Q6067162) (← links)
- A Unified Framework for Change Point Detection in High-Dimensional Linear Models (Q6069892) (← links)
- Weak Signal Identification and Inference in Penalized Likelihood Models for Categorical Responses (Q6086164) (← links)
- Testing stochastic dominance with many conditioning variables (Q6108264) (← links)
- A power analysis for Model-X knockoffs with \(\ell_p\)-regularized statistics (Q6136579) (← links)
- Positive-definite thresholding estimators of covariance matrices with zeros (Q6168115) (← links)
- An integrated surrogate model constructing method: annealing combinable Gaussian process (Q6188162) (← links)