Shrinkage tuning parameter selection with a diverging number of parameters
From MaRDI portal
DOI: 10.1111/J.1467-9868.2008.00693.X
zbMATH Open: 1250.62036
OpenAlex: W2050031210
MaRDI QID: Q2920262
Authors: Hansheng Wang, Bo Li, Chenlei Leng
Publication date: 25 October 2012
Published in: Journal of the Royal Statistical Society. Series B. Statistical Methodology
Full work available at URL: https://doi.org/10.1111/j.1467-9868.2008.00693.x
Recommendations
- Consistent tuning parameter selection in high dimensional sparse linear regression
- Tuning parameter selectors for the smoothly clipped absolute deviation method
- On the adaptive elastic net with a diverging number of parameters
- Consistent selection of tuning parameters via variable selection stability
- A generalized Dantzig selector with shrinkage tuning
Cites Work
- Estimating the dimension of a model
- The Adaptive Lasso and Its Oracle Properties
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Title not available
- SCAD-penalized regression in high-dimensional partially linear models
- Can the strengths of AIC and BIC be shared? A conflict between model identification and regression estimation
- Unified LASSO Estimation by Least Squares Approximation
- Title not available
- Asymptotic properties of bridge estimators in sparse high-dimensional regression models
- Adaptive Lasso for Cox's proportional hazards model
- Regression Model Selection—A Residual Likelihood Approach
- Consistent linear model selection
Cited In (only showing first 100 items)
- Semiparametric Bayesian information criterion for model selection in ultra-high dimensional additive models
- Variable selection in linear measurement error models via penalized score functions
- Using penalized EM algorithm to infer learning trajectories in latent transition CDM
- A study on tuning parameter selection for the high-dimensional lasso
- Globally adaptive quantile regression with ultra-high dimensional data
- Consistent tuning parameter selection in high-dimensional group-penalized regression
- Shrinkage tuning parameter selection in precision matrices estimation
- Promote sign consistency in the joint estimation of precision matrices
- Model Selection via Bayesian Information Criterion for Quantile Regression Models
- Robust statistical inference for longitudinal data with nonignorable dropouts
- Smooth-Threshold GEE Variable Selection in High-Dimensional Partially Linear Models with Longitudinal Data
- Local linear smoothing for sparse high dimensional varying coefficient models
- Histopathological imaging-based cancer heterogeneity analysis via penalized fusion with model averaging
- Change-point detection for variance piecewise constant models
- Variable selection in high-dimensional double generalized linear models
- A Lasso-penalized BIC for mixture model selection
- Longitudinal clustering for heterogeneous binary data
- Factor modelling for high-dimensional time series: inference and model selection
- Rank reduction for high-dimensional generalized additive models
- Smooth predictive model fitting in regression
- Further asymptotic properties of the generalized information criterion
- An \(L_1\)-regularized logistic model for detecting short-term neuronal interactions
- A model-averaging approach for high-dimensional regression
- Sparse group variable selection based on quantile hierarchical Lasso
- Detection and estimation of block structure in spatial weight matrix
- A tuning-free robust and efficient approach to high-dimensional regression
- Model selection in sparse high-dimensional vine copula models with an application to portfolio risk
- An alternating direction method of multipliers for MCP-penalized regression with high-dimensional data
- Title not available
- Tuning parameter selection in sparse regression modeling
- Smoothed partially linear quantile regression with nonignorable missing response
- SCAD-penalized regression in additive partially linear proportional hazards models with an ultra-high-dimensional linear part
- Variable selection in high-dimensional partly linear additive models
- Smoothed empirical likelihood inference and variable selection for quantile regression with nonignorable missing response
- Exploring dimension learning via a penalized probabilistic principal component analysis
- Penalized empirical likelihood for the sparse Cox regression model
- Texture analysis using Gaussian graphical models
- A note on the consistency of Schwarz's criterion in linear quantile regression with the SCAD penalty
- Quadratic approximation for nonconvex penalized estimations with a diverging number of parameters
- Shrinkage averaging estimation
- Variable Selection for Semiparametric Partially Linear Covariate-Adjusted Regression Models
- AIC for the Lasso in generalized linear models
- Exponentially tilted likelihood inference on growing dimensional unconditional moment models
- Empirical likelihood for censored linear regression and variable selection
- On skewed Gaussian graphical models
- Least squares approximation with a diverging number of parameters
- Mixed Lasso estimator for stochastic restricted regression models
- Variable selection and parameter estimation with the Atan regularization method
- A relative error-based approach for variable selection
- Modeling trend processes in parametric mortality models
- Bayesian high-dimensional screening via MCMC
- Simultaneous variable selection for joint models of longitudinal and survival outcomes
- Efficient approximate \(k\)-fold and leave-one-out cross-validation for ridge regression
- Regularized latent class analysis with application in cognitive diagnosis
- Calibrating nonconvex penalized regression in ultra-high dimension
- Bayesian hyper-Lassos with non-convex penalization
- Quadratic approximation via the SCAD penalty with a diverging number of parameters
- Variable Selection Using a Smooth Information Criterion for Distributional Regression Models
- Nested coordinate descent algorithms for empirical likelihood
- Hybrid generalized empirical likelihood estimators: instrument selection with adaptive lasso
- Penalized empirical likelihood for generalized linear models with longitudinal data
- Sparse estimation in functional linear regression
- Sparse and efficient estimation for partial spline models with increasing dimension
- Adaptive lasso for generalized linear models with a diverging number of parameters
- Smoothed quantile regression with nonignorable dropouts
- A distribution-based Lasso for a general single-index model
- Consistent tuning parameter selection in high dimensional sparse linear regression
- Semi-varying coefficient models with a diverging number of components
- Nonconcave penalized estimation for partially linear models with longitudinal data
- Parsimonious Model Averaging With a Diverging Number of Parameters
- A pseudo-heuristic parameter selection rule for \(l^1\)-regularized minimization problems
- Variable selection for partially linear varying coefficient quantile regression model
- Penalized empirical likelihood for high-dimensional generalized linear models with longitudinal data
- Spline estimation and variable selection for single-index prediction models with diverging number of index parameters
- Individualized Multidirectional Variable Selection
- Tuning parameter selection for the adaptive LASSO in the autoregressive model
- Individual Data Protected Integrative Regression Analysis of High-Dimensional Heterogeneous Data
- Multivariate spatiotemporal models with low rank coefficient matrix
- Logical and test consistency in pairwise multiple comparisons
- Order selection for regression-based hidden Markov model
- Penalized generalized empirical likelihood in high-dimensional weakly dependent data
- Adaptive penalized weighted least absolute deviations estimation for the accelerated failure time model
- AIC for the non-concave penalized likelihood method
- Subgroup analysis for the functional linear model
- Estimation of Sparse Structural Parameters with Many Endogenous Variables
- Matrix regression heterogeneity analysis
- Ultra-high dimensional variable screening via Gram-Schmidt orthogonalization
- Optimal regression parameter-specific shrinkage by plug-in estimation
- Sparsely restricted penalized estimators
- In defense of LASSO
- Tuning parameter selection for penalised empirical likelihood with a diverging number of parameters
- Non-marginal feature screening for additive hazard model with ultrahigh-dimensional covariates
- Using penalized likelihood to select parameters in a random coefficients multinomial logit model
- Combined-penalized likelihood estimations with a diverging number of parameters
- Oracle efficient estimation of structural breaks in cointegrating regressions
- Bayesian latent factor on image regression with nonignorable missing data
- A systematic review on model selection in high-dimensional regression
- Distributed Decoding From Heterogeneous 1-Bit Compressive Measurements
- Heterogeneous quantile regression for longitudinal data with subgroup structures
- Penalized empirical likelihood inference for the GINAR(p) model