Projection theorems and estimating equations for power-law models
From MaRDI portal
Publication:2034450
Abstract: We extend projection theorems concerning the Hellinger and Jones et al. divergences to the continuous case. These projection theorems reduce certain estimation problems on generalized exponential models to linear problems. We introduce a notion of regularity for generalized exponential models and show that the projection theorems in this case are similar to those in the discrete and canonical cases. We also apply these ideas to solve certain estimation problems concerning the Student and Cauchy distributions.
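The kind of estimation problem the abstract mentions can be illustrated with a small numerical sketch. This is not the paper's method, and all function names here are hypothetical: it estimates the location parameter of a Cauchy distribution by grid-searching the minimiser of a discretised squared Hellinger distance between a sample histogram and the model density.

```python
# Hypothetical sketch: minimum-Hellinger-distance estimation of a Cauchy
# location parameter via grid search (illustrative only, not the paper's method).
import math
import random

def cauchy_pdf(x, loc, scale=1.0):
    """Density of the Cauchy distribution with the given location and scale."""
    z = (x - loc) / scale
    return 1.0 / (math.pi * scale * (1.0 + z * z))

def hellinger_sq(sample, loc, lo=-10.0, hi=10.0, bins=200):
    """Squared Hellinger distance between a histogram of `sample` (restricted
    to [lo, hi)) and a Cauchy(loc, 1) model, both treated as discrete
    distributions on the bin grid."""
    width = (hi - lo) / bins
    counts = [0] * bins
    kept = 0
    for x in sample:
        if lo <= x < hi:
            counts[int((x - lo) / width)] += 1
            kept += 1
    d = 0.0
    for i, c in enumerate(counts):
        mid = lo + (i + 0.5) * width
        p = c / kept                       # empirical bin probability
        q = cauchy_pdf(mid, loc) * width   # model bin probability (approx.)
        d += (math.sqrt(p) - math.sqrt(q)) ** 2
    return 0.5 * d

def min_hellinger_location(sample, grid):
    """Grid-search minimiser of the squared Hellinger distance over `grid`."""
    return min(grid, key=lambda loc: hellinger_sq(sample, loc))

random.seed(0)
# Standard Cauchy sample shifted to location 2, via the inverse-CDF method.
sample = [2.0 + math.tan(math.pi * (random.random() - 0.5)) for _ in range(5000)]
grid = [i / 10.0 for i in range(-50, 51)]   # candidate locations in [-5, 5]
est = min_hellinger_location(sample, grid)
print(est)   # typically close to the true location 2.0
```

Minimum-divergence estimators of this kind remain stable for the Cauchy family, where the sample mean fails; the projection theorems of the paper address when such minimisation problems reduce to linear equations.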
Recommendations
- A projection method of estimation for a subfamily of exponential families
- Information projections revisited
- A minimax result for divergence projections on locally convex sets of measures
- Divergences and duality for estimation and test under moment condition models
- scientific article; zbMATH DE number 4113731
Cites work
- scientific article; zbMATH DE number 3173999
- scientific article; zbMATH DE number 3872359
- scientific article; zbMATH DE number 3911472
- scientific article; zbMATH DE number 44577
- scientific article; zbMATH DE number 3567782
- scientific article; zbMATH DE number 1220667
- scientific article; zbMATH DE number 1240598
- scientific article; zbMATH DE number 788228
- scientific article; zbMATH DE number 795297
- scientific article; zbMATH DE number 2218505
- scientific article; zbMATH DE number 2221907
- scientific article; zbMATH DE number 3103174
- A comparison of related density-based minimum divergence estimators
- Central limit theorem and deformed exponentials
- Cramér–Rao and Moment-Entropy Inequalities for Rényi Entropy and Generalized Fisher Information
- Decomposable pseudodistances and applications in statistical estimation
- Dual divergence estimators and tests: robustness results
- Encoding Tasks and Rényi Entropy
- Evaluation of the maximum-likelihood estimator where the likelihood equation has multiple roots
- Families of alpha-, beta- and gamma-divergences: flexible and robust measures of similarities
- Generalized Cauchy distributions
- Generalized cutoff rates and Rényi's information measures
- Generalized minimizers of convex integral functionals, Bregman distance, Pythagorean identities
- Generalized projections for non-negative functions
- Guessing Under Source Uncertainty
- I-divergence geometry of probability distributions and minimization problems
- Information Theory and Statistics: A Tutorial
- Information geometry of \(q\)-Gaussian densities and behaviors of solutions to related diffusion equations
- Information projections revisited
- Information theoretic learning. Renyi's entropy and kernel perspectives
- Introduction to Nonextensive Statistical Mechanics
- Minimization Problems Based on Relative \(\alpha\)-Entropy I: Forward Projection
- Minimization Problems Based on Relative \(\alpha\)-Entropy II: Reverse Projection
- Minimum Hellinger Distance Estimation for Multivariate Location and Covariance
- Minimum Hellinger distance estimates for parametric models
- Minimum divergence estimators, maximum likelihood and exponential families
- Newton-based stochastic optimization using \(q\)-Gaussian smoothed functional algorithms
- On multivariate truncated generalized Cauchy distribution
- On the distribution of the roots of certain symmetric matrices
- Papers on probability, statistics and statistical physics. Ed. by R. D. Rosenkrantz
- Parametric estimation and tests through divergences and the duality technique
- Possible generalization of Boltzmann-Gibbs statistics.
- Projection Theorems for the Rényi Divergence on \(\alpha\)-Convex Sets
- Projective power entropy and maximum Tsallis entropy distributions
- Robust Blind Source Separation by Beta Divergence
- Robust Estimation: A Weighted Maximum Likelihood Approach
- Robust and efficient estimation by minimising a density power divergence
- Robust parameter estimation with a small bias against heavy contamination
- Sanov property, generalized I-projection and a conditional limit theorem
- Several applications of divergence criteria in continuous families
- Some results concerning maximum Rényi entropy distributions
- Statistical Inference
- The logarithmic super divergence and asymptotic inference properties
- The power divergence and the density power divergence families: the mathematical connection
- Towards a better understanding of the dual representation of phi divergences
- Why least squares and maximum entropy? An axiomatic approach to inference for linear inverse problems
Cited in (4)
- A unified approach to the Pythagorean identity and projection theorem for a class of divergences based on M-estimations
- Computing statistical divergences with sigma points
- Power-law Lévy processes, power-law vector random fields, and some extensions
- Conformal mirror descent with logarithmic divergences