Information-theoretic upper and lower bounds for statistical estimation

Publication: 3547876

DOI: 10.1109/TIT.2005.864439
zbMath: 1320.94033
OpenAlex: W2130797782
MaRDI QID: Q3547876

Author: Tong Zhang

Publication date: 21 December 2008

Published in: IEEE Transactions on Information Theory

Full work available at URL: https://doi.org/10.1109/tit.2005.864439

Related Items

Learning with Limited Samples: Meta-Learning and Applications to Communication Systems
Fast learning rates in statistical inference through aggregation
Unnamed Item
General inequalities for Gibbs posterior with nonadditive empirical risk
Posterior concentration and fast convergence rates for generalized Bayesian learning
Model-free posterior inference on the area under the receiver operating characteristic curve
A Bayesian approach to (online) transfer learning: theory and algorithms
Robust posterior inference for Youden's index cutoff
Minimax rates for conditional density estimation via empirical entropy
User-friendly Introduction to PAC-Bayes Bounds
Robust and rate-optimal Gibbs posterior inference on the boundary of a noisy image
Joint production in stochastic non-parametric envelopment of data with firm-specific directions
`Purposely misspecified' posterior inference on the volatility of a jump diffusion process
Adaptive variable selection for sequential prediction in multivariate dynamic models
Gibbs posterior concentration rates under sub-exponential type losses
Adaptive Bayesian density estimation with location-scale mixtures
Model misspecification, Bayesian versus credibility estimation, and Gibbs posteriors
Quasi-Bayesian analysis of nonparametric instrumental variables models
Fano's inequality for random variables
Using the doubling dimension to analyze the generalization of learning algorithms
Bayesian inference on volatility in the presence of infinite jump activity and microstructure noise
Gibbs posterior for variable selection in high-dimensional classification and data mining
Risk minimization for time series binary choice with variable selection
A note on some algorithms for the Gibbs posterior
Unnamed Item
The benefit of group sparsity
Contextuality of misspecification and data-dependent losses
Aggregation by exponential weighting, sharp PAC-Bayesian bounds and sparsity
Sparse recovery in convex hulls via entropy penalization
Gibbs posterior inference on multivariate quantiles
Gibbs posterior inference on value-at-risk
On general Bayesian inference using loss functions
On the properties of variational approximations of Gibbs posteriors
Minimum description length revisited
Predicting Panel Data Binary Choice with the Gibbs Posterior
Unnamed Item
Gibbs posterior convergence and the thermodynamic formalism