Pages that link to "Item:Q5743163"
From MaRDI portal
The following pages link to Tuning Parameter Selection in High Dimensional Penalized Likelihood (Q5743163):
Displaying 50 items.
- Selection by partitioning the solution paths (Q114375)
- Log-Contrast Regression with Functional Compositional Predictors: Linking Preterm Infant's Gut Microbiome Trajectories to Neurobehavioral Outcome (Q138028)
- Detecting weak signals in high dimensions (Q272081)
- Random subspace method for high-dimensional regression with the \texttt{R} package \texttt{regRSM} (Q311298)
- Oracle inequalities, variable selection and uniform inference in high-dimensional correlated random effects panel data models (Q311643)
- A Dirichlet process functional approach to heteroscedastic-consistent covariance estimation (Q324685)
- Lasso penalized model selection criteria for high-dimensional multivariate linear regression analysis (Q458641)
- Smooth predictive model fitting in regression (Q512005)
- Using penalized EM algorithm to infer learning trajectories in latent transition CDM (Q823864)
- A self-calibrated direct approach to precision matrix estimation and linear discriminant analysis in high dimensions (Q829737)
- Tuning-free ridge estimators for high-dimensional generalized linear models (Q830109)
- High-dimensional \(A\)-learning for optimal dynamic treatment regimes (Q1650064)
- Homogeneity detection for the high-dimensional generalized linear model (Q1658352)
- The use of random-effect models for high-dimensional variable selection problems (Q1659014)
- On the sign consistency of the Lasso for the high-dimensional Cox model (Q1661333)
- Regularized latent class analysis with application in cognitive diagnosis (Q1682442)
- Asymptotically honest confidence regions for high dimensional parameters by the desparsified conservative Lasso (Q1706454)
- Variable screening for high dimensional time series (Q1746535)
- Pairwise fusion approach incorporating prior constraint information (Q1988277)
- Model selection in sparse high-dimensional vine copula models with an application to portfolio risk (Q2001097)
- Efficient regularized regression with \(L_0\) penalty for variable selection and network construction (Q2011726)
- High-dimensional variable selection via low-dimensional adaptive learning (Q2044323)
- Clustering of subsample means based on pairwise L1 regularized empirical likelihood (Q2046479)
- Sparse spatially clustered coefficient model via adaptive regularization (Q2084064)
- Penalized quasi-likelihood estimation of generalized Pareto regression -- consistent identification of risk factors for extreme losses (Q2138617)
- Hierarchical correction of \(p\)-values via an ultrametric tree running Ornstein-Uhlenbeck process (Q2155000)
- Parallel integrative learning for large-scale multi-response regression with incomplete outcomes (Q2242011)
- High-dimensional mean estimation via \(\ell_1\) penalized normal likelihood (Q2252887)
- Kernel density regression (Q2301065)
- Group variable selection in the Andersen-Gill model for recurrent event data (Q2301106)
- Tuning parameter calibration for \(\ell_1\)-regularized logistic regression (Q2317308)
- Linear hypothesis testing for high dimensional generalized linear models (Q2328055)
- AIC for the non-concave penalized likelihood method (Q2414941)
- Dependence modelling in ultra high dimensions with vine copulas and the graphical Lasso (Q2416782)
- Consistent tuning parameter selection in high-dimensional group-penalized regression (Q2423857)
- Concordance and value information criteria for optimal treatment decision (Q2656587)
- Asymptotics of AIC, BIC and \(C_p\) model selection rules in high-dimensional regression (Q2676924)
- Information criteria for latent factor models: a study on factor pervasiveness and adaptivity (Q2688660)
- Representing Sparse Gaussian DAGs as Sparse R-Vines Allowing for Non-Gaussian Dependence (Q3391116)
- Tuning Parameter Selection in the LASSO with Unspecified Propensity (Q4556969)
- Identifying Latent Structures in Restricted Latent Class Models (Q4559708)
- Forward-Backward Selection with Early Dropping (Q4633015)
- (Q4633031)
- A study on tuning parameter selection for the high-dimensional lasso (Q4960728)
- Model Selection for High-Dimensional Quadratic Regression via Regularization (Q4962427)
- A Sparse Learning Approach to Relative-Volatility-Managed Portfolio Selection (Q4988547)
- (Q4998936)
- An Efficient Linearly Convergent Regularized Proximal Point Algorithm for Fused Multiple Graphical Lasso Problems (Q4999369)
- Nonparametric homogeneity pursuit in functional-coefficient models (Q5023851)
- Modeling association between multivariate correlated outcomes and high-dimensional sparse covariates: the adaptive SVS method (Q5036571)