On Information and Sufficiency

Publication: 5804592


DOI: 10.1214/aoms/1177729694
zbMath: 0042.38403
Wikidata: Q56286611
Scholia: Q56286611
MaRDI QID: Q5804592

Solomon Kullback, Richard Leibler

Publication date: 1951

Published in: The Annals of Mathematical Statistics

Full work available at URL: https://doi.org/10.1214/aoms/1177729694
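The directed divergence introduced in this paper, now usually called the Kullback-Leibler divergence, is \(D(P \| Q) = \sum_i p_i \log(p_i / q_i)\) for discrete distributions. A minimal sketch of it in Python follows; the function name and the convention of returning infinity when \(Q\) assigns zero probability where \(P\) does not are illustrative choices, not taken from the source.

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P || Q) = sum_i p_i * log(p_i / q_i)
    for discrete distributions given as sequences of probabilities."""
    # Sanity check: both inputs should sum to 1 (up to floating-point error).
    if not (math.isclose(sum(p), 1.0) and math.isclose(sum(q), 1.0)):
        raise ValueError("inputs must be probability distributions")
    total = 0.0
    for pi, qi in zip(p, q):
        if pi > 0.0:
            if qi == 0.0:
                # P is not absolutely continuous w.r.t. Q: divergence is infinite.
                return math.inf
            total += pi * math.log(pi / qi)
    return total
```

For example, `kl_divergence([1.0, 0.0], [0.5, 0.5])` equals `log 2` (in nats), and the divergence of any distribution from itself is zero. Note that \(D(P \| Q) \neq D(Q \| P)\) in general; the paper's "Nonsymmetrical distance" phrasing in several related items below refers to exactly this asymmetry.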



Related Items

An entropic framework for the normal distribution in capability analysis, A Bayesian method in determining the order of a finite state Markov chain, Data processing using information theory functionals, The reconstruction of a positive function from its finite Fourier series, On the Fisher Information, Combining Monte Carlo and Cox tests of non-nested hypotheses, A sequential estimation procedure for m-dimensional Gaussian processes with independent increments, On the choice of a discrepancy functional for model selection, Kullback-Leibler information approach to the optimum measurement point for Bayesian estimation, Optimal control designs using predicting densities for the multivariate linear model, On shrinkage to interval estimators of the binomial p, A test for independence based on the correlation dimension, THE POWER AND SIZE OF NONPARAMETRIC TESTS FOR COMMON DISTRIBUTIONAL CHARACTERISTICS, Novel strategies to approximate probability trees in penniless propagation, Generalized Jeffrey's rule of conditioning and evidence combining rule for a priori probabilistic knowledge in conditional evidence theory, Measuring stochastic dependence using \(\phi\)-divergence, Corrected version of \(AIC\) for selecting multivariate normal linear regression models in a general nonnormal case, Nonlinear regression modeling using regularized local likelihood method, On the averaging of symmetric positive-definite tensors, The \(K_{\varphi}\)-divergence statistic for categorical data problems, Regularity properties and pathologies of position-space renormalization-group transformations: scope and limitations of Gibbsian theory, Estimation of Kullback-Leibler divergence by local likelihood, Robust and efficient parametric estimation for censored survival data, Fully probabilistic control design, Bayesian nonparametric model selection and model testing, Inaccuracy and coding theory, Nonsymmetrical distance between probability 
distributions, entropy and the theorem of Pythagoras, Entropy, inaccuracy and information, On amount of information of type-\(\beta\) and other measures, Computational approaches to parameter estimation and model selection in immunology, An axiomatic approach to the definition of the entropy of a discrete Choquet capacity, The probability to select the correct model using likelihood-ratio based criteria in choosing between two nested models of which the more extended one is true, Vector quantization using information theoretic concepts, Some results on generalized residual entropy, A logic for inductive probabilistic reasoning, Information-statistical pattern based approach for data mining, Free energies based on generalized entropies and H-theorems for nonlinear Fokker–Planck equations, Modeling nonlinear time series with local mixtures of generalized linear models, A new treatment of communication processes with Gaussian channels, Lazy evaluation in penniless propagation over join trees, Entropy Rate and Maximum Entropy Methods for Countable Semi-Markov Chains, An Entropy Frailty Model for Dependent Variables, On the écart between two “amounts of information”, ENTROPY OF ORDER α AND TYPE β AND SHANNON'S INEQUALITY, On the sample size for random experiments with fuzzy imprecision, Bayesian analysis of parametric hypothesis tests, On non-Gaussianity and dependence in financial time series: a nonextensive approach, A NONPARAMETRIC BOOTSTRAP TEST OF CONDITIONAL DISTRIBUTIONS, A Paradigm for Masking (Camouflaging) Information, The extraction of information from multiple point estimates, Sensitivity Analysis in Gaussian Bayesian Networks Using a Divergence Measure, Continuity bounds on the quantum relative entropy, An H-theorem for the general relativistic Ornstein-Uhlenbeck process, Some tests for the power series distributions in one parameter using the 
Kullback-Leibler information measure, A NEW DIRECTED DIVERGENCE MEASURE AND ITS CHARACTERIZATION, ON THE FAMILIES OF SOLUTIONS TO GENERALIZED MAXIMUM ENTROPY AND MINIMUM CROSS-ENTROPY PROBLEMS, Information-theoretic approach to classifying operators in conveyor systems, Interval estimation for the exponential inverse Gaussian distribution, Optimal Portfolio Diversification Using the Maximum Entropy Principle, Information Affinity: A New Similarity Measure for Possibilistic Uncertain Information, Optimal Observation Times in Experimental Epidemic Processes, AN AXIOMATIC DEFINITION OF FUZZY DIVERGENCE MEASURES, BOUNDS FOR THE DEVIATION OF A FUNCTION FROM THE CHORD GENERATED BY ITS EXTREMITIES, Testing for Homogeneity in Mixture Using Weighted Relative Entropy, A Method to Generate Multivariate Data with the Desired Moments, Hamiltonian identification for quantum systems: well-posedness and numerical approaches, Fitting Mixture Distributions Using Generalized Lambda Distributions and Comparison with Normal Mixtures, The Variational Gaussian Approximation Revisited, Performance of information criteria for spatial models, Optimal experimental control in econometrics: the simultaneous equation problem, A design of single sampling plans by attributes based on the Kullback-Leibler information, A design of single sampling plans by variables based on Kullback-Leibler information, ON SOME INEQUALITIES AND GENERALIZED ENTROPIES: A UNIFIED APPROACH, Adaptive M-estimation of symmetric distribution location, An extension of Bayesian measure of information to regression, Information theoretical mortality table graduation, Mathematical techniques for quantum communication theory, Square-root versus logarithmic transformation of a Gamma distribution, M.D.I. 
estimation via unconstrained convex programming, Renewal theory and the sequential design of experiments with two states of nature, Distance measures for stochastic models, A Bayesian approach to detect informative observations in an experiment, Invariants of the Markov process by the transformation of variables, A comparison of the estimative and predictive methods of estimating posterior probabilities, On the Fisher β-sufficient partition, The Kullback-Leibler approximation of the marginal posterior density: An application to the linear functional model, An alternative Bayesian approach to the multivariate Behrens-Fisher problem, On the use of divergence statistics to make inferences about three habitats, Generalized Jensen difference divergence measures and Fisher measure of information, A minimum discrimination information estimator of preliminary conjectured normal variance, Conditional iterative proportional fitting for Gaussian distributions, Informational complexity criteria for regression models, A consistent nonparametric test for serial independence, On the consistency of the maximum spacing method, Estimation of the mean and the covariance matrix under a marginal independence assumption -- an application of matrix differential calculus, Connections of generalized divergence measures with Fisher information matrix, Some statistical applications of generalized Jensen difference divergence measures for fuzzy information systems, Monotonicity of the Fisher information and the Kullback-Leibler divergence measure, On testing independence in multidimensional contingency tables with stratified random sampling, Asymptotic properties of divergence statistics in a stratified random sampling and its applications to test statistical hypotheses, On entropy functionals of states of operator algebras, The Kullback-Leibler risk of the Stein estimator and the conditional MLE, Generalized divergence measures and 
the probability of error, On measuring the distance between histograms, Using probability trees to compute marginals with imprecise probabilities, Neural networks and logistic regression: Part I, Generalized divergence measures: Information matrices, amount of information, asymptotic distribution, and its applications to test statistical hypotheses, Duality and equilibrium prices in economics of uncertainty, Indistinguishability of particles or independence of the random variables?, A comparison of some estimators of the mixture proportion of mixed normal distributions, On a unification of divergences by means of information amounts, Modeling belief in dynamic systems. I: Foundations, Predicting the number of accidents at a road junction, Sensitivity and robustness in selection problems, Parametric extensions of Shannon inequality and its reverse one in Hilbert space operators, Tomographic analysis of sign-altering functions by maximum entropy method, The econometric consequences of the ceteris paribus condition in economic theory, Some new statistics for testing point null hypotheses with prior information, Importance sampling in Bayesian networks using probability trees, On preferred point geometry in statistics, Evaluation of Bayesian networks with flexible state-space abstraction methods, Learning Bayesian networks from data: An information-theory based approach, Prior knowledge for learning networks in non-probabilistic settings, Time series clustering with ARMA mixtures, Minimum \(K_\phi\)-divergence estimator, A predictive density approach to predicting a future observable in multilevel models, A distance measure for bounding probabilistic belief change, Utility, informativity and protocols, A measure of discrimination between past lifetime distributions, Computational modelling with functional differential equations: identification, selection, and sensitivity, Higher order optimal approximation of Csiszár's \(f\)-divergence, Inequalities for quantum 
relative entropy, New lower bounds for statistical query learning, Statistical management of fuzzy elements in random experiments. II: The Fisher information associated with a fuzzy information system, Asymptotic behaviour and statistical applications of divergence measures in multinomial populations: A unified study, Information theory and superefficiency, A measure of discrimination between two residual life-time distributions and its applications, A parallel algorithm for building possibilistic causal networks, The curvature induced by covariance, A generalized maxentropic inversion procedure for noisy data, Host-based intrusion detection using dynamic and static behavioral models, Information and entropy econometrics -- editor's view, Information indices: Unification and applications, Information-theoretic estimation of preference parameters: macroeconomic applications and simulation evidence, Information theoretic measures of the income distribution in food demand, Uses of entropy and divergence measures for evaluating econometric approximations and inference, Brain electrical activity analysis using wavelet-based informational tools. 
II: Tsallis non-extensivity and complexity measures, A generalized \(\varphi\)-divergence for asymptotically multivariate normal models, AIC, overfitting principles, and the boundedness of moments of inverse matrices for vector autoregressions and related models, Data compression and learning in time sequences analysis, Statistical complexity and disequilibrium, An information-geometric approach to a theory of pragmatic structuring, On exchangeable, causal and cascading failures, Information complexity criteria for detecting influential observations in dynamic multivariate linear models using the genetic algorithm, Asymptotic theory for information criteria in model selection -- functional approach, Partial information reference priors: Derivation and interpretations, Jaynes approach to the dynamics of Darwin systems, Selection of smoothing parameters in \(B\)-spline nonparametric regression models using information criteria, Relative information of type \(s\), Csiszár's \(f\)-divergence, and information inequalities, Generalized regression trees, Limit theorems for the logarithm of sample spacings, Local discriminant bases and their applications, Entropy, divergence and distance measures with econometric applications, Information criteria for selecting possibly misspecified parametric models, Nested models for categorical data, A large-sample model selection criterion based on Kullback's symmetric divergence, Akaike's information criterion and recent developments in information complexity, Key concepts in model selection: Performance and generalizability, Information theoretic criteria in non-parametric density estimation. 
Bias and variance in the infinite dimensional case, Asymptotic properties of \((r,s)\)-directed divergences in a stratified sampling, The \(\lambda\)-divergence and the \(\lambda\)-mutual information: Estimation in the stratified sampling, On the evolutionary selection of sets of Nash equilibria, Quantifying filter bank decorrelating performance via matrix diagonality, Proportional reversed hazard rate model and its applications, Model selection criteria based on Kullback information measures for nonlinear regression, Probability distribution of wave velocity in heterogeneous media due to random phase configuration, Probability distributions conditioned by the available information: Gamma distribution and moments, Dynamically stable sets in infinite strategy spaces, Rényi statistics for testing hypotheses in mixed linear regression models, Probabilistic distance measures of the Dirichlet and beta distributions, Gelfand-Yaglom-Perez theorem for generalized relative entropy functionals, On logarithmic convexity for differences of power means, Continuity of information transport in surjective cellular automata, Variational posterior distribution approximation in Bayesian super resolution reconstruction of multispectral images, On the \(J\)-divergence of intuitionistic fuzzy sets with its application to pattern recognition, Testing nonparametric and semiparametric hypotheses in vector stationary processes, A symmetric information divergence measure of the Csiszár's \(f\)-divergence class and its bounds, A Bayesian approach to testing decision making axioms, A formal approach to using data distributions for building causal polytree structures, Thermodynamic uncertainty relations and irreversibility, Joint probability distribution of composite quantum systems, Effects of collinearity on information about regression coefficients, On the loss of information due to fuzziness in experimental observations, On minimum information prior distributions, Entropy differential 
metric, distance and divergence measures in probability spaces: A unified approach, Minimum chi-square estimation and tests for model selection, Order independence and factor convergence in iterative scaling, Divergence measure between fuzzy sets, A classification of the main probability distributions by minimizing the weighted logarithmic measure of deviation, Between-group analysis with heterogeneous covariance matrices: The common principal component model, On the entropy and inaccuracy of similarly and oppositely ordered discrete probability distributions. I, Admissibility and complete class results for the multinomial estimation problem with entropy and squared error loss, On parametric weighted information improvement, Robust inference of trees, Intrinsic credible regions: an objective Bayesian approach to interval estimation (with comments and rejoinder), Selecting features in microarray classification using ROC curves, Entropy for semi-Markov processes with Borel state spaces: asymptotic equirepartition properties and invariance principles, Bounds for \(f\)-divergences under likelihood ratio constraints, The max-min hill-climbing Bayesian network structure learning algorithm, A maximum entropy method for particle filtering, Bias correction of cross-validation criterion based on Kullback-Leibler information under a general condition, Interpreting Kullback-Leibler divergence with the Neyman-Pearson Lemma, The Cauchy-Schwarz divergence and Parzen windowing: Connections to graph theory and Mercer kernels, Moments of utility functions and their applications, Hill-climbing and branch-and-bound algorithms for exact and approximate inference in credal networks, Log-concavity and the maximum entropy property of the Poisson distribution, Decomposing a relation into a tree of binary relations, Bounds on the probability of error in terms of generalized information radii, Some aspects of quantum information theory and their applications to irreversible processes, 
Model validation and predictive capability for the thermal challenge problem, Objective Bayesianism with predicate languages, Werner states and the two-spinors Heisenberg anti-ferromagnet, Chemical bonds through probability scattering: Information channels for intermediate-orbital stages, Nonlinear regression modeling via regularized radial basis function networks, Optimal experimental design criterion for discriminating semiparametric models, Statistical information approaches for the modelling of the epileptic brain, On a new moments inequality, Extreme inaccuracies in Gaussian Bayesian networks, A note on information entropy measures for vague sets and its applications, Learning decision trees with taxonomy of propositionalized attributes, Agglomerative hierarchical clustering of continuous variables based on mutual information, Relevance measures for subset variable selection in regression problems based on \(k\)-additive mutual information, A model selection criterion based on the BHHJ measure of divergence, A semiparametric model selection criterion with applications to the marginal structural model, A hybrid EM approach to spatial clustering, On the distance between some \(\pi\)ps sampling designs, Approximate probability propagation with mixtures of truncated exponentials, Distribution metrics and image segmentation, Projection-based Bayesian recursive estimation of ARX model with uniform innovations, Revisiting prior distributions. 
II: Implications of the physical prior in maximum entropy analysis, On the geometry of generalized Gaussian distributions, Bhattacharyya statistical divergence of quantum observables, Dealing with label switching in mixture models under genuine multimodality, Shrinkage estimation in the frequency domain of multivariate time series, Object matching in disjoint cameras using a color transfer approach, Estimative influence measures for the multivariate general linear model, The relative information generating function, A conditional limit construction of the normal probability density, Sufficient subalgebras and the relative entropy of states of a von Neumann algebra, On convergence of conditional probability measures, On symmetry and the directed divergence in information theory, The relative dimension of a probabilistic experiment, Parameter estimation of partially observed continuous time stochastic processes via the EM algorithm, The application of the principle of minimum cross-entropy to the characterization of the exponential-type probability distributions, Quasicyclic symmetry and the directed divergence in information theory, Model selection and Akaike's information criterion (AIC): The general theory and its analytical extensions, On mathematical information channels with a non-commutative intermediate system, Selecting the best linear regression model. 
A classical approach, The role of duality in optimization problems involving entropy functionals with applications to information theory, Statistical aspects of divergence measures, Restricted exponential forgetting in real-time identification, Properties of the relative entropy of states of von Neumann algebras, Decentralized detection by a large number of sensors, A unified framework for connectionist systems, Information theory as a unifying statistical approach for use in marketing research, The geometrical structure of the parameter space of the two-dimensional normal distribution, Efficient feature-subset selection with probabilistic distance criteria, Finitely determined processes - An indiscrete approach, A comparison of the information and posterior probability criteria for model selection, Information in experiments and sufficiency, The Student distribution and the principle of maximum entropy, Maximum entropy interpretation of autoregressive spectral densities, Certainty equivalents and information measures: Duality and extremal principles, Cross entropy minimization in uninvadable states of complex populations, The proper formula for relative entropy and its asymptotics in quantum probability, \((R,S)\)-information radius of type \(t\) and comparison of experiments, Convergence properties of MLE's and asymptotic simultaneous confidence intervals in fitting cardinal B-splines for density estimation, Minimum cross-entropy analysis with entropy-type constraints, Fuzziness in the experimental outcomes: Comparing experiments and removing the loss of information, The measure-theoretic aspects of entropy. 
I, Deducing the Schrödinger equation from minimum \(\chi^2\), Seasonality and approximation errors in rational expectations models, Characterization of the relative entropy of states of matrix algebras, Estimating a model through the conditional MLE, Relative entropy under mappings by stochastic matrices, A Bayesian alternative to parametric hypothesis testing, Estimative and predictive distances, Markovian representation of stochastic processes and its application to the analysis of autoregressive moving average processes, Bayesian discriminant approach to input signal selection in parameter estimation problems, Independency relationships and learning algorithms for singly connected networks, A BAYESIAN INTERPRETATION OF MULTIPLE POINT ESTIMATES, A radial basis function artificial neural network test for neglected nonlinearity, Implications of Form Invariance to the Structure of Nonextensive Entropies, Entropy estimation of symbol sequences, A multivariate Gompertz-type distribution, When has estimation reached a steady state? 
The Bayesian sequential test, UNCOVERING SHORT-TIME CORRELATIONS BETWEEN MULTICHANNEL RECORDINGS OF BRAIN ACTIVITY: A PHASE-SPACE APPROACH, Kullback-Leibler information and interval estimation, Mixture-based adaptive probabilistic control, Estimation and prediction with ARMMAX model: a mixture of ARMAX models with common ARX part, The predictive influence of variables in a Normal Regression Model, A NEW METHOD FOR COMPARING EXPERIMENTS AND MEASURING INFORMATION, Closeness of Gamma and Generalized Exponential Distribution, On detecting change in likelihood ratio ordering, Locality of Global Stochastic Interaction in Directed Acyclic Networks, CONTINUITY OF A CLASS OF ENTROPIES AND RELATIVE ENTROPIES, THE LOG-EIG DISTRIBUTION: A NEW PROBABILITY MODEL FOR LIFETIME DATA, On double stage minimum discrimination information estimators of the interval constrained normal mean, On Unified (R,S)-Information Measures, Bayesian diagnostics for checking assumptions of normality, Strong Limit Theorems for Sums of Logarithms of High Order Spacings, On testing hypotheses with doubly censored data, Tests of Equality of Parameters of Two Normal Populations in Bayesian Viewpoint, Measurement error and information, Local diffusion models for stochastic reacting systems: estimation issues in equation-free numerics, Inverse problems and model validation: an example from latent virus reactivation, Surrogate Marker Evaluation from an Information Theory Perspective, Quantifying the Effect of the Surrogate Marker by Information Gain, Representations of Space and Time in the Maximization of Information Flow in the Perception-Action Loop, Generalized arithmetic and geometric mean divergence measure and their statistical aspects, Predictive Influence of Variables in a Multivariate Distribution in Presence of Perfect Multicollinearity, A Procedure for Identification of Principal Variables by Least Generalized Dependence, Regression and ICOMP -- A Simulation Study, Properties of Entropies of 
Record Values in Reliability and Life Testing Context, Conditional expectation in an operator algebra. IV. Entropy and information, On information in operator algebras, Large deviations of divergence measures on partitions, Information measures for global geopotential models, A hybrid methodology for learning belief networks: BENEDICT, Role and results of statistical methods in protein fold class prediction, The Equivalence of Optimum Transducers and Sufficient and Most Efficient Statistics