Projection theorems and estimating equations for power-law models
DOI: 10.1016/J.JMVA.2021.104734 · zbMATH Open: 1467.62086 · arXiv: 1905.01434 · OpenAlex: W3131289517 · Wikidata: Q113870431 · Scholia: Q113870431 · MaRDI QID: Q2034450 · FDO: Q2034450
Publication date: 22 June 2021
Published in: Journal of Multivariate Analysis
Abstract: We extend projection theorems concerning the Hellinger and Jones et al. divergences to the continuous case. These projection theorems reduce certain estimation problems on generalized exponential models to linear problems. We introduce a notion of regularity for generalized exponential models and show that the projection theorems in this case are similar to those in the discrete and canonical cases. We also apply these ideas to solve certain estimation problems concerning the Student and Cauchy distributions.
Full work available at URL: https://arxiv.org/abs/1905.01434
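The abstract's central point is that, for power-law (generalized exponential) models such as the Student and Cauchy families, suitable projection theorems turn divergence-based estimation into linear problems. As a point of contrast, here is a minimal sketch of the usual nonlinear route that such results simplify: numerically solving the maximum-likelihood estimating equations for a Cauchy(mu, sigma) sample. This is illustrative only, not the paper's method; the function name cauchy_score and the choice of scipy.optimize.fsolve are our own assumptions.

```python
# Minimal sketch (not the paper's method): solve the ML estimating
# equations for the Cauchy location-scale family numerically.
import numpy as np
from scipy.optimize import fsolve

def cauchy_score(params, x):
    """ML estimating equations for Cauchy(mu, sigma); both must vanish.

    Derived from the log-likelihood
        l(mu, sigma) = n*log(sigma) - sum log((x - mu)^2 + sigma^2) + const.
    """
    mu, sigma = params
    r = x - mu
    d = r**2 + sigma**2
    eq_mu = np.sum(r / d)                        # proportional to dl/dmu
    eq_sigma = np.sum(sigma**2 / d) - len(x) / 2  # dl/dsigma, rearranged
    return [eq_mu, eq_sigma]

rng = np.random.default_rng(0)
x = 1.0 + 2.0 * rng.standard_cauchy(500)          # true mu = 1, sigma = 2
mu_hat, sigma_hat = fsolve(cauchy_score, x0=[np.median(x), 1.0], args=(x,))
print(f"mu_hat = {mu_hat:.3f}, sigma_hat = {sigma_hat:.3f}")
```

The point of contrast: these score equations are nonlinear in (mu, sigma) and need an iterative solver, whereas the projection theorems studied in the paper reduce the corresponding divergence-based estimation problems on such power-law families to linear ones.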
Keywords: estimating equation; divergence; Cauchy distribution; Student distribution; projection theorem; power-law family
Cites Work
- (12 further cited works; titles not available)
- Why least squares and maximum entropy? An axiomatic approach to inference for linear inverse problems
- Possible generalization of Boltzmann-Gibbs statistics
- Information Theoretic Learning
- Papers on probability, statistics and statistical physics. Ed. by R. D. Rosenkrantz
- I-divergence geometry of probability distributions and minimization problems
- Minimum Hellinger distance estimates for parametric models
- A comparison of related density-based minimum divergence estimators
- Robust Blind Source Separation by Beta Divergence
- Introduction to Nonextensive Statistical Mechanics
- Minimum Hellinger Distance Estimation for Multivariate Location and Covariance
- Robust and efficient estimation by minimising a density power divergence
- Information projections revisited
- Statistical Inference
- Information Theory and Statistics: A Tutorial
- Robust parameter estimation with a small bias against heavy contamination
- Cramér–Rao and Moment-Entropy Inequalities for Rényi Entropy and Generalized Fisher Information
- Decomposable pseudodistances and applications in statistical estimation
- Dual divergence estimators and tests: robustness results
- Information geometry of q-Gaussian densities and behaviors of solutions to related diffusion equations
- Projective power entropy and maximum Tsallis entropy distributions
- Evaluation of the maximum-likelihood estimator where the likelihood equation has multiple roots
- Families of alpha-, beta- and gamma-divergences: flexible and robust measures of similarities
- Generalized cutoff rates and Rényi's information measures
- On the distribution of the roots of certain symmetric matrices
- Sanov property, generalized I-projection and a conditional limit theorem
- Some results concerning maximum Rényi entropy distributions
- Generalized projections for non-negative functions
- Parametric estimation and tests through divergences and the duality technique
- The power divergence and the density power divergence families: the mathematical connection
- Minimization Problems Based on Relative α-Entropy I: Forward Projection
- Several applications of divergence criteria in continuous families
- Minimum divergence estimators, maximum likelihood and exponential families
- Central limit theorem and deformed exponentials
- Newton-based stochastic optimization using q-Gaussian smoothed functional algorithms
- Robust Estimation: A Weighted Maximum Likelihood Approach
- Generalized Cauchy distributions
- The logarithmic super divergence and asymptotic inference properties
- On multivariate truncated generalized Cauchy distribution
- Towards a better understanding of the dual representation of phi divergences
- Projection Theorems for the Rényi Divergence on α-Convex Sets
- Minimization Problems Based on Relative α-Entropy II: Reverse Projection
- Encoding Tasks and Rényi Entropy
- Guessing Under Source Uncertainty
- Generalized minimizers of convex integral functionals, Bregman distance, Pythagorean identities
Cited In (4)
- Conformal mirror descent with logarithmic divergences
- Computing statistical divergences with sigma points
- Power-law Lévy processes, power-law vector random fields, and some extensions
- A unified approach to the Pythagorean identity and projection theorem for a class of divergences based on M-estimations