Scientific article
From MaRDI portal
Publication:3292859
zbMath: 0106.33001 · MaRDI QID: Q3292859
Publication date: 1961
Title: zbMATH Open Web Interface contents unavailable due to conflicting licenses.
Keywords: information measures; entropy measures; information-theoretical proof of a limit theorem on Markov chains; mean value related to a Kolmogorov-Nagumo function; mean-value property of entropy
Mathematics Subject Classification: Measures of information, entropy (94A17); Statistical aspects of information-theoretic topics (62B10); Limit theorems in probability theory (60F99)
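The keywords above (entropy measures, mean values related to a Kolmogorov-Nagumo function) indicate that the publication concerns generalized entropies of the Rényi type, where the order-\(\alpha\) entropy arises as a quasi-arithmetic mean of the self-information. As context only, here is a minimal sketch of that quantity; the function name and the example distribution are illustrative, not taken from the source.

```python
import math

def renyi_entropy(p, alpha):
    """Order-alpha entropy H_alpha(p) = log(sum_i p_i^alpha) / (1 - alpha), in nats.

    At alpha = 1 the defining Kolmogorov-Nagumo mean degenerates and the
    limit is the ordinary Shannon entropy -sum_i p_i log p_i.
    """
    if any(pi < 0 for pi in p) or abs(sum(p) - 1.0) > 1e-9:
        raise ValueError("p must be a probability distribution")
    if abs(alpha - 1.0) < 1e-12:
        # alpha -> 1 limit: Shannon entropy (terms with p_i = 0 contribute nothing)
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return math.log(sum(pi ** alpha for pi in p if pi > 0)) / (1.0 - alpha)

p = [0.5, 0.25, 0.25]
shannon = renyi_entropy(p, 1.0)    # = 1.5 ln 2, about 1.0397 nats
collision = renyi_entropy(p, 2.0)  # = -ln(0.375), about 0.9808 nats
```

For a fixed distribution, \(H_\alpha\) is non-increasing in \(\alpha\); for the uniform distribution on \(n\) points every order gives the same value, \(\log n\).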
Related Items (only the first 100 items are shown)
Monotonicity of the unified quantum \((r,s)\)-entropy and \((r,s)\)-mutual information ⋮
A new generalized Weibull family of distributions: mathematical properties and applications ⋮
Optimal quantization of the support of a continuous multivariate distribution based on mutual information ⋮
Charged Rényi entropies in CFTs with Einstein-Gauss-Bonnet holographic duals ⋮
LREE of a dynamical unstable D\(p\)-brane ⋮
A new upside-down bathtub shaped hazard rate model for survival data analysis ⋮
Sherman's and related inequalities with applications in information theory ⋮
Entropy bounds for dendrimers ⋮
The multidimensional moment problem with complexity constraint ⋮
Efficient information theoretic inference for conditional moment restrictions ⋮
Tighter uncertainty principles based on quaternion Fourier transform ⋮
Integral representations and asymptotic expansions for Shannon and Renyi entropies ⋮
On conclusive eavesdropping and measures of mutual information in quantum key distribution ⋮
Swiveled Rényi entropies ⋮
On the granularity of summative kernels ⋮
The Lomax generator of distributions: properties, minification process and regression model ⋮
Observations on entanglement entropy in massive QFT's ⋮
Information gain and approaching true belief ⋮
Bounds for \(f\)-divergences under likelihood ratio constraints. ⋮
A family of generalized quantum entropies: definition and properties ⋮
Rényi entropies for free field theories ⋮
A necessary and sufficient condition for the inequality of generalized weighted means ⋮
From \(\varepsilon\)-entropy to KL-entropy: analysis of minimum information complexity density estimation ⋮
Uncertainty and certainty relations for complementary qubit observables in terms of Tsallis' entropies ⋮
Gibbs' theorem for open systems with incomplete statistics ⋮
An approach to multiway contingency tables based on \(\phi \)-divergence test statistics ⋮
Typical Rényi dimensions of measures. The cases: \(q=1\) and \(q=\infty\) ⋮
Quasi-concave density estimation ⋮
On partial monotonic behaviour of some entropy measures ⋮
Limit theory for point processes in manifolds ⋮
On the dynamic survival entropy ⋮
Tensor network contractions for \#SAT ⋮
Minimum divergence estimators, maximum likelihood and exponential families ⋮
Analyzing supersaturated designs with entropic measures ⋮
Rényi entropies and nonlinear diffusion equations ⋮
Option price calibration from Rényi entropy ⋮
Results on residual Rényi entropy of order statistics and record values ⋮
Determining the number of clusters using information entropy for mixed data ⋮
Independent attacks in imperfect settings: a case for a two-way quantum key distribution scheme ⋮
The Kumaraswamy Marshal-Olkin family of distributions ⋮
Marshall-Olkin generalized exponential distribution ⋮
Bayesian hybrid generative discriminative learning based on finite Liouville mixture models ⋮
Equilibrium statistical mechanics for incomplete nonextensive statistics ⋮
An information-theoretic characterization of weighted \(\alpha \)-proportional fairness in network resource allocation ⋮
Tsallis entropy measure of noise-aided information transmission in a binary channel ⋮
Rényi divergence and \(L_p\)-affine surface area for convex bodies ⋮
Distances and discrimination rates for stochastic processes ⋮
Multivariate measures of skewness for the skew-normal distribution ⋮
Spread of highly localized wave-packet in the tight-binding lattice: entropic and information-theoretical characterization ⋮
Characterization properties of the log-normal distribution obtained with the help of divergence measures ⋮
Information measures based on fractional calculus ⋮
Trees with exponentially growing costs ⋮
Multi-sample Rényi test statistics ⋮
Divergence-based tests of homogeneity for spatial data ⋮
Making the Cauchy work ⋮
Private information and the `Information function': A survey of possible uses ⋮
On the minimization of concave information functionals for unsupervised classification via decision trees ⋮
On the entropy of a hidden Markov process ⋮
Lindley distribution and its application ⋮
Weighted Dagum and related distributions ⋮
Some applications of majorization to quantum entropies ⋮
On the way towards a generalized entropy maximization procedure ⋮
Characterizations of generalized entropy functions by functional equations ⋮
Entropies and Heun functions associated with positive linear operators ⋮
Uncertainty relations for general canonically conjugate observables in terms of unified entropies ⋮
Some new results on the Rényi quantile entropy ordering ⋮
Analyzing anonymity attacks through noisy channels ⋮
Exponentiated Kumaraswamy-Dagum distribution with applications to income and lifetime data ⋮
The odd generalized exponential family of distributions with applications ⋮
The beta Marshall-Olkin family of distributions ⋮
Asymptotical improvement of maximum likelihood estimators on Kullback-Leibler loss ⋮
Note on von Neumann and Rényi entropies of a graph ⋮
Robust coding for a class of sources: Applications in control and reliable communication over limited capacity channels ⋮
A class of Rényi information estimators for multidimensional densities ⋮
Executives' perceived environmental uncertainty shortly after 9/11 ⋮
Re-parameterization of multinomial distributions and diversity indices ⋮
Divergences test statistics for discretely observed diffusion processes ⋮
Rényi statistics for testing equality of autocorrelation coefficients ⋮
Some new fuzzy entropy formulas ⋮
Optimal vector quantization in terms of Wasserstein distance ⋮
The Blackwell and Dubins theorem and Rényi's amount of information measure: Some applications ⋮
Some characterization results on generalized cumulative residual entropy measure ⋮
Partial monotonicity of entropy measures ⋮
Reversed version of a generalized sharp Hölder's inequality and its applications ⋮
Development of two new mean codeword lengths ⋮
Statistical inference for the \(\epsilon \)-entropy and the quadratic Rényi entropy ⋮
\(\beta\)-entropy for Pareto-type distributions and related weighted distributions ⋮
Reliability model using truncated skew-Laplace distribution ⋮
Relative entropy and holography ⋮
The relative useful information measure: Some comments ⋮
Unified (r,s)-entropy and its bivariate measures ⋮
A generalized class of certainty and information measures ⋮
General bootstrap for dual \(\phi\)-divergence estimates ⋮
Sampling hyperspheres via extreme value theory: implications for measuring attractor dimensions ⋮
How noise affects effort in tournaments ⋮
Characterization of a quantitative-qualitative measure of relative information ⋮
Criteria of self-organization. ⋮
Statistical approach to fractal-structured systems: an illustration from the physics of semiconductor heterostructures ⋮
On entropy, entropy-like quantities, and applications ⋮
Central limit theorem and convergence to stable laws in Mallows distance