Theory of statistical inference and information. Transl. from the Slovak by the author

Publication: 1210865
zbMath: 0711.62002
MaRDI QID: Q1210865

Author: Igor Vajda

Publication date: 5 June 1993

Published in: Theory and Decision Library. Series B: Mathematical and Statistical Methods

Related Items (showing the first 100 items)

A preliminary test in classification and probabilities of misclassification
Refinements of discrete and integral Jensen inequalities with Jensen's gap
On reconsidering entropies and divergences and their cumulative counterparts: Csiszár's, DPD's and Fisher's type cumulative and survival measures
On a cornerstone of bare-simulation distance/divergence optimization
Generalized Neyman-Pearson lemma via convex duality.
Generalized arithmetic and geometric mean divergence measure and their statistical aspects
A simulation study of a nested sequence of binomial regression models
Generalized potential functions in differential geometry and information geometry
On local divergences between two probability measures
Extensions of recent combinatorial refinements of discrete and integral Jensen inequalities
The ϕ-divergence family of measures based on quantile function
Broadcasting on trees and the Ising model.
Conditions equivalent to consistency of approximate MLE's for stochastic processes
RAO'S STATISTIC FOR THE ANALYSIS OF UNIFORM ASSOCIATION IN CROSS-CLASSIFICATIONS
APPROXIMATIONS TO POWERS OF φ-DISPARITY GOODNESS-OF-FIT TESTS*
Estimation of \(f\)-divergence and Shannon entropy by Levinson-type inequalities via new Green's functions and Lidstone polynomial
Variable selection via penalized minimum φ-divergence estimation in logistic regression
Bounds for \(f\)-divergences under likelihood ratio constraints.
On convergence of Fisher informations in continuous models with quantized observations
On Generalized Tests of Fit for Multinomial Populations
Frameworks and results in distributionally robust optimization
Existence, consistency and computer simulation for selected variants of minimum distance estimators
On the asymptotics of minimum disparity estimation
A geometry on the space of probabilities. II: Projective spaces and exponential families
A refinement and an exact equality condition for the basic inequality of f-divergences
New converses of Jensen inequality via Green functions with applications
A generalization of \(f\)-divergence measure to convex functions defined on linear spaces
Reverses of the Jensen inequality in terms of first derivative and applications
Some inequalities for Csiszár divergence via theory of time scales
Convergence rates for empirical barycenters in metric spaces: curvature, convexity and extendable geodesics
Adaptive testing of multiple hypotheses for stochastic processes
Geometric characterization of Weyl's discrepancy norm in terms of its \(n\)-dimensional unit balls
On the symmetrized \(s\)-divergence
General Lebesgue integral inequalities of Jensen and Ostrowski type for differentiable functions whose derivatives in absolute value are h-convex and applications
Estimations of \(f\)- and Rényi divergences by using a cyclic refinement of the Jensen's inequality
Aggregation of estimators and stochastic optimization
Local equivalences of distances between clusterings -- a geometric perspective
Robust Statistical Engineering by Means of Scaled Bregman Distances
Analysis of \(\varPhi\)-divergence for loglinear models with constraints under product-multinomial sampling
Generalization of the Kullback-Leibler divergence in the Tsallis statistics
Divergence-based confidence intervals in false-positive misclassification model
On the empirical estimation of integral probability metrics
Distances and discrimination rates for stochastic processes
Geometry of the probability simplex and its connection to the maximum entropy method
Robust estimation for non-homogeneous data and the selection of the optimal tuning parameter: the density power divergence approach
Optimal Sampling of Parametric Families: Implications for Machine Learning
Minimum Kolmogorov Distance Estimates for Multivariate Parametrized Families
Tests of goodness of fit based on Phi-divergence
Rényi divergence and the central limit theorem
\((R,S)\)-information radius of type \(t\) and comparison of experiments
Jensen-Ostrowski type inequalities and applications for \(f\)-divergence measures
On testing local hypotheses via local divergence
Testing hypotheses in truncated samples by means of divergence statistics
Goodness-of-fit tests via phi-divergences
Asymptotic expansion of the risk of maximum likelihood estimator with respect to α-divergence
Change-Point Estimation as a Nonlinear Regression Problem
Minimum and Maximum Information Censoring Plans in Progressive Censoring
An \((R',S')\)-norm fuzzy relative information measure and its applications in strategic decision-making
Choosing the best Rukhin goodness-of-fit statistics
Lower Bound on the Error Probability for Families with Bounded Likelihood Ratios
Rényi statistics for testing equality of autocorrelation coefficients
Adaptive tests for stochastic processes in the ergodic case
Minimum disparity estimators for discrete and continuous models.
Distance Measures and Applications to Multimodal Variational Imaging
A Survey of Reverse Inequalities for f-Divergence Measure in Information Theory
LATTICE-VALUED POSSIBILISTIC ENTROPY MEASURE
Stein-type estimation in logistic regression models based on minimum \(\phi\)-divergence estimators
Auto-association measures for stationary time series of categorical data
Limit laws for disparities of spacings
Minimum disparity estimation: improved efficiency through inlier modification
Limit Theorems for φ-Divergences Based on k-Spacings
Asymptotic behaviour and statistical applications of divergence measures in multinomial populations: A unified study
On efficiency of estimation and testing with data quantized to fixed number of cells
Some Universal Insights on Divergences for Statistics, Machine Learning and Artificial Intelligence
Levinson-type inequalities via new Green functions and Montgomery identity
An extension of likelihood-ratio-test for testing linear hypotheses in the baseline-category logit model
About the sharpness of the Jensen inequality
Gene expression and protein-protein interaction data for identification of colon cancer related genes using \(f\)-information measures
Estimation of a continuous distribution on the real line by discretization methods
Admissible kernels for RKHS embedding of probability distributions
RETRACTED ARTICLE: Generalization of the Levinson inequality with applications to information theory
Generalization of the Levinson inequality with applications to information theory
Estimation of f-divergence and Shannon entropy by using Levinson type inequalities for higher order convex functions via Hermite interpolating polynomial
Refinements of the integral Jensen's inequality generated by finite or infinite permutations
