The following pages link to Gaussian model selection (Q5945247), showing 50 items:
- Statistical inference for the optimal approximating model (Q1950381)
- Minimax risks for sparse regressions: ultra-high dimensional phenomenons (Q1950804)
- Spatial adaptation in heteroscedastic regression: propagation approach (Q1950843)
- Selecting the length of a principal curve within a Gaussian model (Q1951117)
- Model selection in regression under structural constraints (Q1951123)
- Optimal model selection in heteroscedastic regression using piecewise polynomial functions (Q1951154)
- Adaptive estimation of linear functionals by model selection (Q1951783)
- Simultaneous estimation of the mean and the variance in heteroscedastic Gaussian regression (Q1951804)
- Model selection by resampling penalization (Q1951992)
- MAP model selection in Gaussian regression (Q1952087)
- The Lasso as an \(\ell _{1}\)-ball model selection procedure (Q1952205)
- Sparsity considerations for dependent variables (Q1952207)
- Model selection and sharp asymptotic minimaxity (Q1955840)
- The Goldenshluger-Lepski method for constrained least-squares estimators over RKHSs (Q1983602)
- Multiple change-points detection by empirical Bayesian information criteria and Gibbs sampling induced stochastic search (Q1984867)
- Tail-greedy bottom-up data decompositions and fast multiple change-point detection (Q1990585)
- Empirical Bayes oracle uncertainty quantification for regression (Q1996760)
- A MOM-based ensemble method for robustness, subsampling and hyperparameter tuning (Q2044333)
- Consistency of a range of penalised cost approaches for detecting multiple changepoints (Q2084454)
- Detecting possibly frequent change-points: wild binary segmentation 2 and steepest-drop model selection (Q2131951)
- GSDAR: a fast Newton algorithm for \(\ell_0\) regularized generalized linear models with statistical guarantee (Q2135875)
- Semiparametric inference for mixtures of circular data (Q2154957)
- How can we identify the sparsity structure pattern of high-dimensional data: an elementary statistical analysis to interpretable machine learning (Q2170515)
- Empirical risk minimization as parameter choice rule for general linear regularization methods (Q2179243)
- Estimating piecewise monotone signals (Q2180071)
- Consistent model selection criteria and goodness-of-fit test for common time series models (Q2180087)
- On estimation of isotonic piecewise constant signals (Q2196185)
- Model selection: from theory to practice (Q2197389)
- A general framework for Bayes structured linear models (Q2215762)
- Gaussian linear model selection in a dependent context (Q2233592)
- Bayesian model selection and the concentration of the posterior of hyperparameters (Q2259290)
- Maxisets for model selection (Q2267395)
- On signal reconstruction in white noise using dictionaries (Q2269367)
- Spike and slab empirical Bayes sparse credible sets (Q2278657)
- Needles and straw in a haystack: robust confidence for possibly sparse sequences (Q2278660)
- A breakpoint detection in the mean model with heterogeneous variance on fixed time intervals (Q2302484)
- Estimation and model selection for model-based clustering with the conditional classification likelihood (Q2346523)
- Adaptive estimation over anisotropic functional classes via oracle approach (Q2352739)
- Minimal penalties for Gaussian model selection (Q2369862)
- A penalized criterion for variable selection in classification (Q2370521)
- Statistical estimation with model selection (Q2385788)
- General maximum likelihood empirical Bayes estimation of normal means (Q2388976)
- Oracle convergence rate of posterior under projection prior and Bayesian model selection (Q2437894)
- Estimating composite functions by model selection (Q2438264)
- Sparse PCA: optimal rates and adaptive estimation (Q2443213)
- Estimation and variable selection with exponential weights (Q2447091)
- Sparse model selection under heterogeneous noise: exact penalisation and data-driven thresholding (Q2447094)
- Model selection for density estimation with \(\mathbb L_2\)-loss (Q2447292)
- Aggregation for Gaussian regression (Q2456016)
- On optimality of Bayesian testimation in the normal means problem (Q2466690)