Penalized Maximum Likelihood Principle for Choosing Ridge Parameter
From MaRDI portal
Publication: 3652709
DOI: 10.1080/03610910903061014
zbMath: 1191.62125
OpenAlex: W1976381939
MaRDI QID: Q3652709
Publication date: 16 December 2009
Published in: Communications in Statistics - Simulation and Computation
Full work available at URL: https://doi.org/10.1080/03610910903061014
Related Items (5)
- Computational Method for Jackknifed Generalized Ridge Tuning Parameter based on Generalized Maximum Entropy
- The Loss Rank Criterion for Variable Selection in Linear Regression Analysis
- Efficient approximate k-fold and leave-one-out cross-validation for ridge regression
- On the Choice of the Ridge Parameter: A Maximum Entropy Approach
- Bias-correction for Weibull common shape estimation
Cites Work
- Smoothing noisy data with spline functions: Estimating the correct degree of smoothing by the method of generalized cross-validation
- Some Modifications for Choosing Ridge Parameters
- A Monte Carlo Study of Recent Ridge Parameters
- Choosing Ridge Parameter for Regression Problems
- The Loss Rank Principle for Model Selection
- Tuning parameter selectors for the smoothly clipped absolute deviation method
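The first cited work (Craven and Wahba) chooses a smoothing parameter by generalized cross-validation (GCV), a standard baseline against which ridge-parameter selectors such as the penalized maximum likelihood principle are compared. A minimal sketch of GCV-based ridge tuning on synthetic data (the data, the search grid, and the use of GCV here are illustrative assumptions, not the paper's own criterion):

```python
import numpy as np

# Synthetic regression data (illustrative only).
rng = np.random.default_rng(0)
n, p = 50, 5
X = rng.normal(size=(n, p))
beta = np.array([1.0, 0.5, 0.0, -0.5, 2.0])
y = X @ beta + rng.normal(scale=0.5, size=n)

def gcv(lmbda):
    """GCV score n * RSS / (n - tr(H))^2 for the ridge hat matrix H."""
    # Ridge hat matrix: H = X (X'X + lambda I)^{-1} X'
    H = X @ np.linalg.solve(X.T @ X + lmbda * np.eye(p), X.T)
    resid = y - H @ y
    return n * (resid @ resid) / (n - np.trace(H)) ** 2

# Pick the ridge parameter minimizing GCV over a log-spaced grid.
grid = np.logspace(-4, 2, 100)
best = min(grid, key=gcv)
print("selected lambda:", best)
```

The selected value can then be plugged into the ridge estimator (X'X + lambda I)^{-1} X'y; alternative selectors, such as the penalized maximum likelihood principle of this publication, replace the GCV score with a different criterion over the same grid.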