scientific article; zbMATH DE number 5957408
From MaRDI portal
Publication: 3174050
zbMATH Open: 1222.62008; MaRDI QID: Q3174050
Publication date: 12 October 2011
Full work available at URL: http://www.jmlr.org/papers/v7/zhao06a.html
Title of this publication is not available in the database record.
Recommendations
- Model selection consistency of Lasso for empirical data
- Lasso with convex loss: model selection consistency and estimation
- A note on the Lasso and related procedures in model selection
- A random model approach for the LASSO
- Strong consistency of Lasso estimators
- The Lasso as an \(\ell _{1}\)-ball model selection procedure
- Regularizing LASSO: a consistent variable selection method
- Model selection consistency of \(U\)-statistics with convex loss and weighted Lasso penalty
- Model selection with mixed variables on the Lasso path
- Improving Lasso for model selection and prediction
Cited In (showing first 100 items)
- Recovery of partly sparse and dense signals
- Shrinkage tuning parameter selection in precision matrices estimation
- Discussion: ``A significance test for the lasso''
- Discussion: ``A significance test for the lasso''
- Discussion: ``A significance test for the lasso''
- Discussion: ``A significance test for the lasso''
- Gene-environment interaction analysis under the Cox model
- Sparse regression: scalable algorithms and empirical performance
- The smooth-Lasso and other \(\ell _{1}+\ell _{2}\)-penalized methods
- Sorted concave penalized regression
- Robust estimation for an inverse problem arising in multiview geometry
- Multi-stage convex relaxation for feature selection
- The predictive Lasso
- Variational Bayes for High-Dimensional Linear Regression With Sparse Priors
- Adaptive log-density estimation
- The slow, steady ascent of a hot solid sphere in a Newtonian fluid with strongly temperature-dependent viscosity
- Generalized Kalman smoothing: modeling and algorithms
- Discussion: Latent variable graphical model selection via convex optimization
- Variable selection in heteroscedastic single-index quantile regression
- Nonparametric variable selection and its application to additive models
- Sparse factor regression via penalized maximum likelihood estimation
- Interquantile shrinkage in spatial additive autoregressive models
- Sparse system identification for stochastic systems with general observation sequences
- A consistent algorithm to solve Lasso, elastic-net and Tikhonov regularization
- The Dantzig selector: recovery of signal via \(\ell _{1} - \alpha \ell _{2}\) minimization
- Sparse recovery via differential inclusions
- Sparse high-dimensional linear regression. Estimating squared error and a phase transition
- Tight conditions for consistency of variable selection in the context of high dimensionality
- Searching for minimal optimal neural networks
- Rapid penalized likelihood-based outlier detection via heteroskedasticity test
- High-dimensional Gaussian model selection on a Gaussian design
- Bayesian high-dimensional screening via MCMC
- Sparse regression and support recovery with \(\mathbb{L}_2\)-boosting algorithms
- The variational Garrote
- Feature selection guided by structural information
- CAM: causal additive models, high-dimensional order search and penalized regression
- High-dimensional Bayesian inference in nonparametric additive models
- Model selection and estimation in high dimensional regression models with group SCAD
- Discussion: One-step sparse estimates in nonconcave penalized likelihood models
- Greedy forward regression for variable screening
- A Note on High-Dimensional Linear Regression With Interactions
- Laplace error penalty-based variable selection in high dimension
- Weighted \(\ell_1\)-penalized corrected quantile regression for high dimensional measurement error models
- On nonparametric feature filters in electromagnetic imaging
- Model selection via adaptive shrinkage with \(t\) priors
- Shrinkage and model selection with correlated variables via weighted fusion
- Generalized M-estimators for high-dimensional Tobit I models
- Going beyond oracle property: selection consistency and uniqueness of local solution of the generalized linear model
- Variable selection via RIVAL (removing irrelevant variables amidst lasso iterations) and its application to nuclear material detection
- On asymptotic risk of selecting models for possibly nonstationary time-series
- Dynamic networks with multi-scale temporal structure
- Consistent group selection with Bayesian high dimensional modeling
- Interpreting latent variables in factor models via convex optimization
- On path restoration for censored outcomes
- Penalized least squares estimation with weakly dependent data
- Tuning parameter selection for the adaptive LASSO in the autoregressive model
- Adaptive and reversed penalty for analysis of high-dimensional correlated data
- Fitting sparse linear models under the sufficient and necessary condition for model identification
- Nonsmoothness in machine learning: specific structure, proximal identification, and applications
- A reproducing kernel Hilbert space approach to high dimensional partially varying coefficient model
- Penalized logspline density estimation using total variation penalty
- Image denoising via solution paths
- Estimation of an oblique structure via penalized likelihood factor analysis
- Structured variable selection via prior-induced hierarchical penalty functions
- A penalized likelihood method for structural equation modeling
- Oracle inequalities for sparse additive quantile regression in reproducing kernel Hilbert space
- Regularization and the small-ball method. I: Sparse recovery
- Parametric and semiparametric reduced-rank regression with flexible sparsity
- Model selection consistency of Lasso for empirical data
- Nonparametric independence screening via favored smoothing bandwidth
- Linear hypothesis testing in dense high-dimensional linear models
- On the post selection inference constant under restricted isometry properties
- Broken adaptive ridge regression and its asymptotic properties
- Necessary and sufficient conditions for variable selection consistency of the Lasso in high dimensions
- Low complexity regularization of linear inverse problems
- Discussion: Latent variable graphical model selection via convex optimization
- Sensitivity analysis for mirror-stratifiable convex functions
- Penalized Regression for Multiple Types of Many Features With Missing Data
- A tuning-free robust and efficient approach to high-dimensional regression
- Robust machine learning by median-of-means: theory and practice
- Adaptive estimation of covariance matrices via Cholesky decomposition
- Uniformly valid confidence sets based on the Lasso
- Tuning parameter calibration for \(\ell_1\)-regularized logistic regression
- Sparse linear models and \(l_1\)-regularized 2SLS with high-dimensional endogenous regressors and instruments
- Simultaneous feature selection and clustering based on square root optimization
- High-dimensional sparse portfolio selection with nonnegative constraint
- REMI: regression with marginal information and its application in genome-wide association studies
- Exploiting Disagreement Between High-Dimensional Variable Selectors for Uncertainty Visualization
- Learning theory approach to a system identification problem involving atomic norm
- A First-Order Optimization Algorithm for Statistical Learning with Hierarchical Sparsity Structure
- High-dimensional Cox models: the choice of penalty as part of the model building process
- Efficient nonconvex sparse group feature selection via continuous and discrete optimization
- Second-order Stein: SURE for SURE and other applications in high-dimensional inference
- Online streaming feature selection using rough sets
- Robust inference for high‐dimensional single index models
- Efficient Threshold Selection for Multivariate Total Variation Denoising
- High-dimensional model recovery from random sketched data by exploring intrinsic sparsity
- Lassoing the HAR model: a model selection perspective on realized volatility dynamics
- A review of Gaussian Markov models for conditional independence