Mixing strategies for density estimation.

From MaRDI portal
Publication: 1848770

DOI: 10.1214/aos/1016120365
zbMath: 1106.62322
OpenAlex: W2011283296
MaRDI QID: Q1848770

Yuhong Yang

Publication date: 14 November 2002

Published in: The Annals of Statistics

Full work available at URL: https://projecteuclid.org/euclid.aos/1016120365



Related Items

Bandwidth selection for kernel density estimation: a review of fully automatic selectors
Performance of empirical risk minimization in linear aggregation
Online forecast combinations of distributions: worst case bounds
Sparsity in penalized empirical risk minimization
A general procedure to combine estimators
Aggregation of predictors for nonstationary sub-linear processes and online adaptive forecasting of time varying autoregressive processes
Convex aggregative modelling of infinite memory nonlinear systems
Aggregation of estimators and stochastic optimization
On the optimality of the empirical risk minimization procedure for the convex aggregation problem
Empirical risk minimization is optimal for the convex aggregation problem
Oracle inequalities for cross-validation type procedures
On the optimality of the aggregate with exponential weights for low temperatures
Least squares model averaging for two non-nested linear models
Kullback-Leibler aggregation and misspecified generalized linear models
Estimator selection with respect to Hellinger-type risks
Adaptively combined forecasting for discrete response time series
Model selection for density estimation with \(\mathbb L_2\)-loss
Optimal learning with \textit{Q}-aggregation
Aggregation of spectral density estimators
Simultaneous adaptation to the margin and to complexity in classification
Catching up Faster by Switching Sooner: A Predictive Approach to Adaptive Estimation with an Application to the AIC–BIC Dilemma
Optimal Kullback-Leibler aggregation in mixture density estimation by maximum likelihood
Fast adaptive estimation of log-additive exponential models in Kullback-Leibler divergence
Optimal rates of aggregation in classification under low noise assumption
Learning by mirror averaging
Estimator selection in the Gaussian setting
Averaging of density kernel estimators
A universal procedure for aggregating estimators
Mixing least-squares estimators when the variance is unknown
Linear and convex aggregation of density estimators
On the exponentially weighted aggregate with the Laplace prior
Localized Gaussian width of \(M\)-convex hulls with applications to Lasso and convex aggregation
Aggregating estimates by convex optimization
Distribution-free robust linear regression



Cites Work