Consistency and generalization bounds for maximum entropy density estimation
From MaRDI portal
Publication: 280753
DOI: 10.3390/e15125439 · zbMath: 1345.62024 · OpenAlex: W2093541146 · MaRDI QID: Q280753
Shaojun Wang, Russell Greiner, Shaomin Wang
Publication date: 10 May 2016
Published in: Entropy
Full work available at URL: https://doi.org/10.3390/e15125439
Cites Work
- Approximation of density functions by sequences of exponential families
- I-divergence geometry of probability distributions and minimization problems
- Convex analysis and nonlinear optimization. Theory and examples
- Empirical margin distributions and bounding the generalization error of combined classifiers
- Statistical behavior and consistency of classification methods based on convex risk minimization
- Weak convergence and empirical processes. With applications to statistics
- A mathematical theory of communication
- Boosting with early stopping: convergence and consistency
- Information Theory and Statistical Mechanics
- Risk bounds for mixture density estimation
- The EM Algorithm and Extensions, 2E
- Graphical Models, Exponential Families, and Variational Inference
- Uniform Central Limit Theorems
- DOI: 10.1162/153244302760200713
- Leave-One-Out Bounds for Kernel Methods
- DOI: 10.1162/1532443041424300