Information-theoretic upper and lower bounds for statistical estimation
From MaRDI portal
Publication:3547876
DOI: 10.1109/TIT.2005.864439 · zbMATH Open: 1320.94033 · OpenAlex: W2130797782 · MaRDI QID: Q3547876 · FDO: Q3547876
Publication date: 21 December 2008
Published in: IEEE Transactions on Information Theory
Full work available at URL: https://doi.org/10.1109/tit.2005.864439
Cited In (42)
- Fano's inequality for random variables
- The benefit of group sparsity
- Title not available
- Title not available
- Learning Theory
- Fast learning rates in statistical inference through aggregation
- Gibbs posterior for variable selection in high-dimensional classification and data mining
- A note on some algorithms for the Gibbs posterior
- Gibbs posterior inference on value-at-risk
- Using the doubling dimension to analyze the generalization of learning algorithms
- Title not available
- Risk minimization for time series binary choice with variable selection
- Learning with Limited Samples: Meta-Learning and Applications to Communication Systems
- Minimax rates for conditional density estimation via empirical entropy
- On the properties of variational approximations of Gibbs posteriors
- Aggregation by exponential weighting, sharp PAC-Bayesian bounds and sparsity
- Adaptive Bayesian density estimation with location-scale mixtures
- Sparse recovery in convex hulls via entropy penalization
- A Bayesian approach to (online) transfer learning: theory and algorithms
- Bayesian inference on volatility in the presence of infinite jump activity and microstructure noise
- Joint production in stochastic non-parametric envelopment of data with firm-specific directions
- Quasi-Bayesian analysis of nonparametric instrumental variables models
- Adaptive variable selection for sequential prediction in multivariate dynamic models
- Minimum description length revisited
- Contextuality of misspecification and data-dependent losses
- General inequalities for Gibbs posterior with nonadditive empirical risk
- Gibbs posterior inference on multivariate quantiles
- Model-free posterior inference on the area under the receiver operating characteristic curve
- Robust and rate-optimal Gibbs posterior inference on the boundary of a noisy image
- Posterior concentration and fast convergence rates for generalized Bayesian learning
- On general Bayesian inference using loss functions
- Predicting Panel Data Binary Choice with the Gibbs Posterior
- 'Purposely misspecified' posterior inference on the volatility of a jump diffusion process
- Robust posterior inference for Youden’s index cutoff
- Gibbs posterior concentration rates under sub-exponential type losses
- High-dimensional sparse classification using exponential weighting with empirical hinge loss
- Approximating Bayes in the 21st century
- Title not available
- Gaussian variational approximations for high-dimensional state space models
- Gibbs posterior convergence and the thermodynamic formalism
- Model misspecification, Bayesian versus credibility estimation, and Gibbs posteriors
- User-friendly Introduction to PAC-Bayes Bounds