Optimal model selection for density estimation of stationary data under various mixing conditions
From MaRDI portal
Abstract: We propose a block-resampling penalization method for marginal density estimation with not necessarily independent observations. When the data are \(\beta\)- or \(\tau\)-mixing, the selected estimator satisfies oracle inequalities with a leading constant asymptotically equal to 1. We also prove in this setting the slope heuristic, a data-driven method to optimize the leading constant in the penalty.
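The slope heuristic mentioned in the abstract is commonly implemented via the "dimension jump": sweep a constant \(C\) multiplying the penalty shape, locate the value at which the selected model's complexity drops sharply, and use twice that minimal constant as the final penalty. The sketch below is a generic illustration of that calibration recipe, not the paper's block-resampling procedure; the function name, the synthetic risk curve, and the grid are all illustrative assumptions.

```python
import numpy as np

def slope_heuristic(emp_risk, pen_shape, c_grid):
    """Toy dimension-jump calibration of the slope heuristic.

    emp_risk[m]  : empirical risk of model m
    pen_shape[m] : penalty shape of model m (e.g. dimension / n)
    c_grid       : increasing grid of candidate penalty constants C
    Returns the index of the model selected with penalty 2*C_min*pen_shape.
    """
    emp_risk = np.asarray(emp_risk, dtype=float)
    pen_shape = np.asarray(pen_shape, dtype=float)
    # Complexity of the model minimizing crit(m) = emp_risk + C * pen_shape,
    # recorded for each candidate constant C.
    chosen = [pen_shape[np.argmin(emp_risk + c * pen_shape)] for c in c_grid]
    # C_min sits just after the largest drop ("dimension jump") in complexity.
    drops = -np.diff(chosen)
    c_min = c_grid[int(np.argmax(drops)) + 1]
    # Slope heuristic: the optimal penalty is twice the minimal one.
    return int(np.argmin(emp_risk + 2.0 * c_min * pen_shape))
```

On a synthetic risk curve such as `emp_risk = 1/d - d` with `pen_shape = d`, the selected complexity stays maximal for small `C` and collapses once `C` crosses the minimal constant, which is the jump the method detects.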
Recommendations
- Optimal model selection in density estimation
- Directional mixture models and optimal estimation of the mixing density
- On Construction and Estimation of Stationary Mixture Transition Distribution Models
- A quick procedure for model selection in the case of mixture of normal densities
- Model selection for mixture models -- perspectives and strategies
- Bayesian density estimation and model selection using nonparametric hierarchical mixtures
- Maximum likelihood estimation and model selection for locally stationary processes
Cites work
- scientific article; zbMATH DE number 3692406
- scientific article; zbMATH DE number 3789676
- scientific article; zbMATH DE number 1064642
- scientific article; zbMATH DE number 854585
- A Bennett concentration inequality and its application to suprema of empirical processes
- Adaptive density deconvolution with dependent inputs
- Adaptive density estimation of stationary \(\beta\)-mixing and \(\tau\)-mixing processes
- Adaptive density estimation under weak dependence
- Adaptive estimation in autoregression or \(\beta\)-mixing regression via model selection
- Concentration around the mean for maxima of empirical processes
- Density estimation by wavelet thresholding
- Introduction to strong mixing conditions. Vol. 1.
- Minimal penalties for Gaussian model selection
- Mixing: Properties and examples
- Model selection by resampling penalization
- New dependence coefficients. Examples and applications to statistics
- Non-strong mixing autoregressive processes
- Nonparametric estimation of the stationary density and the transition density of a Markov chain
- Optimal model selection for density estimation of stationary data under various mixing conditions
- Risk bounds for model selection via penalization
- Risk bounds for statistical learning
- Some Limit Theorems for Random Functions. I
- The jackknife and the bootstrap for general stationary observations
- Weak dependence. With examples and applications.
Cited in (12)
- Piecewise autoregression for general integer-valued time series
- Data-driven model selection for same-realization predictions in autoregressive processes
- Exponential inequalities for nonstationary Markov chains
- Optimal model selection in density estimation
- Slope heuristics and V-fold model selection in heteroscedastic regression using strongly localized bases
- Adaptive directional estimator of the density in \(\mathbb{R}^d\) for independent and mixing sequences
- Consistent model selection criteria and goodness-of-fit test for common time series models
- Adaptive estimation for stochastic damping Hamiltonian systems under partial observation
- Gaussian linear model selection in a dependent context
- Sharp oracle inequalities and slope heuristic for specification probabilities estimation in discrete random fields
- Optimal model selection for density estimation of stationary data under various mixing conditions
- Optimal model selection in heteroscedastic regression using piecewise polynomial functions