Theory of statistical inference and information. Transl. from the Slovak by the author
Publication: 1210865
zbMath: 0711.62002; MaRDI QID: Q1210865
Publication date: 5 June 1993
Published in: Theory and Decision Library. Series B: Mathematical and Statistical Methods
Keywords: entropy; information; statistical inference; Bayes risk; convex functions of measures; distances of probability measures; Kantorovich-Vasershtein distance; Ornstein distances
Introductory exposition (textbooks, tutorial papers, etc.) pertaining to statistics (62-01); Statistical aspects of information-theoretic topics (62B10)
Related Items (only showing first 100 items)
A preliminary test in classification and probabilities of misclassification ⋮ Refinements of discrete and integral Jensen inequalities with Jensen's gap ⋮ On reconsidering entropies and divergences and their cumulative counterparts: Csiszár's, DPD's and Fisher's type cumulative and survival measures ⋮ On a cornerstone of bare-simulation distance/divergence optimization ⋮ Generalized Neyman-Pearson lemma via convex duality. ⋮ Generalized arithmetic and geometric mean divergence measure and their statistical aspects ⋮ A simulation study of a nested sequence of binomial regression models ⋮ Generalized potential functions in differential geometry and information geometry ⋮ On local divergences between two probability measures ⋮ Extensions of recent combinatorial refinements of discrete and integral Jensen inequalities ⋮ The ϕ-divergence family of measures based on quantile function ⋮ Broadcasting on trees and the Ising model. ⋮ Conditions equivalent to consistency of approximate MLE's for stochastic processes ⋮ RAO'S STATISTIC FOR THE ANALYSIS OF UNIFORM ASSOCIATION IN CROSS-CLASSIFICATIONS ⋮ APPROXIMATIONS TO POWERS OF φ-DISPARITY GOODNESS-OF-FIT TESTS* ⋮ Estimation of \(f\)-divergence and Shannon entropy by Levinson-type inequalities via new Green's functions and Lidstone polynomial ⋮ Variable selection via penalized minimum φ-divergence estimation in logistic regression ⋮ Bounds for \(f\)-divergences under likelihood ratio constraints. ⋮ On convergence of Fisher informations in continuous models with quantized observations ⋮ On Generalized Tests of Fit for Multinomial Populations ⋮ Frameworks and results in distributionally robust optimization ⋮ Existence, consistency and computer simulation for selected variants of minimum distance estimators ⋮ On the asymptotics of minimum disparity estimation ⋮ A geometry on the space of probabilities. II: Projective spaces and exponential families ⋮ A refinement and an exact equality condition for the basic inequality of f-divergences ⋮ New converses of Jensen inequality via Green functions with applications ⋮ A generalization of \(f\)-divergence measure to convex functions defined on linear spaces ⋮ Reverses of the Jensen inequality in terms of first derivative and applications ⋮ Some inequalities for Csiszár divergence via theory of time scales ⋮ Convergence rates for empirical barycenters in metric spaces: curvature, convexity and extendable geodesics ⋮ Adaptive testing of multiple hypotheses for stochastic processes ⋮ Geometric characterization of Weyl's discrepancy norm in terms of its \(n\)-dimensional unit balls ⋮ On the symmetrized \(s\)-divergence ⋮ General Lebesgue integral inequalities of Jensen and Ostrowski type for differentiable functions whose derivatives in absolute value are h-convex and applications ⋮ Estimations of \(f\)- and Rényi divergences by using a cyclic refinement of the Jensen's inequality ⋮ Aggregation of estimators and stochastic optimization ⋮ Local equivalences of distances between clusterings -- a geometric perspective ⋮ Robust Statistical Engineering by Means of Scaled Bregman Distances ⋮ Analysis of \(\varPhi\)-divergence for loglinear models with constraints under product-multinomial sampling ⋮ Generalization of the Kullback-Leibler divergence in the Tsallis statistics ⋮ Divergence-based confidence intervals in false-positive misclassification model ⋮ On the empirical estimation of integral probability metrics ⋮ Distances and discrimination rates for stochastic processes ⋮ Geometry of the probability simplex and its connection to the maximum entropy method ⋮ Robust estimation for non-homogeneous data and the selection of the optimal tuning parameter: the density power divergence approach ⋮ Optimal Sampling of Parametric Families: Implications for Machine Learning ⋮ Minimum Kolmogorov Distance Estimates for Multivariate Parametrized Families ⋮ Tests of goodness of fit based on Phi-divergence ⋮ Rényi divergence and the central limit theorem ⋮ \((R,S)\)-information radius of type \(t\) and comparison of experiments ⋮ Jensen-Ostrowski type inequalities and applications for \(f\)-divergence measures ⋮ On testing local hypotheses via local divergence ⋮ Testing hypotheses in truncated samples by means of divergence statistics ⋮ Goodness-of-fit tests via phi-divergences ⋮ Asymptotic expansion of the risk of maximum likelihood estimator with respect to α-divergence ⋮ Change-Point Estimation as a Nonlinear Regression Problem ⋮ Minimum and Maximum Information Censoring Plans in Progressive Censoring ⋮ An \((R',S')\)-norm fuzzy relative information measure and its applications in strategic decision-making ⋮ Choosing the best Rukhin goodness-of-fit statistics ⋮ Lower Bound on the Error Probability for Families with Bounded Likelihood Ratios ⋮ Rényi statistics for testing equality of autocorrelation coefficients ⋮ Adaptive tests for stochastic processes in the ergodic case ⋮ Minimum disparity estimators for discrete and continuous models. ⋮ Distance Measures and Applications to Multimodal Variational Imaging ⋮ A Survey of Reverse Inequalities for f-Divergence Measure in Information Theory ⋮ LATTICE-VALUED POSSIBILISTIC ENTROPY MEASURE ⋮ Stein-type estimation in logistic regression models based on minimum \(\phi\)-divergence estimators ⋮ Auto-association measures for stationary time series of categorical data ⋮ Limit laws for disparities of spacings ⋮ Minimum disparity estimation: improved efficiency through inlier modification ⋮ Limit Theorems for φ-Divergences Based on k-Spacings ⋮ Asymptotic behaviour and statistical applications of divergence measures in multinomial populations: A unified study ⋮ On efficiency of estimation and testing with data quantized to fixed number of cells ⋮ Some Universal Insights on Divergences for Statistics, Machine Learning and Artificial Intelligence ⋮ Levinson-type inequalities via new Green functions and Montgomery identity ⋮ An extension of likelihood-ratio-test for testing linear hypotheses in the baseline-category logit model ⋮ About the sharpness of the Jensen inequality ⋮ Gene expression and protein-protein interaction data for identification of colon cancer related genes using \(f\)-information measures ⋮ Estimation of a continuous distribution on the real line by discretization methods ⋮ Admissible kernels for RKHS embedding of probability distributions ⋮ RETRACTED ARTICLE: Generalization of the Levinson inequality with applications to information theory ⋮ Generalization of the Levinson inequality with applications to information theory ⋮ Estimation of f-divergence and Shannon entropy by using Levinson type inequalities for higher order convex functions via Hermite interpolating polynomial ⋮ Refinements of the integral Jensen's inequality generated by finite or infinite permutations