Publication: 3260839

From MaRDI portal


zbMath: 0088.10406 · MaRDI QID: Q3260839

Solomon Kullback

Publication date: 1959



62-01: Introductory exposition (textbooks, tutorial papers, etc.) pertaining to statistics

94A15: Information theory (general)

62B10: Statistical aspects of information-theoretic topics


Related Items

Thermodynamic uncertainty relations and irreversibility, Effects of collinearity on information about regression coefficients, Convergence properties of high-order Boltzmann machines, A classification of the main probability distributions by minimizing the weighted logarithmic measure of deviation, Limiting values of large deviation probabilities of quadratic statistics, Between-group analysis with heterogeneous covariance matrices: The common principal component model, Verteilungsmaße und Verteilungsindizes, An application of the discrimination information measure to the theory of testing hypotheses. I, A quantitative Occam's razor, Bounds for \(f\)-divergences under likelihood ratio constraints., Information amount and higher-order efficiency in estimation, Behaviour of the Fokker-Planck-Boltzmann equation near a Maxwellian, Application of data compression methods to nonparametric estimation of characteristics of discrete-time stochastic processes, Sequential category aggregation and partitioning approaches for multi-way contingency tables based on survey and census data, Point estimation with exponentially tilted empirical likelihood, Asymptotic inference for semiparametric association models, The principle of maximum entropy, A maximum entropy criterion of filtering and coding for stationary autoregressive signals: Its physical interpretations and suggestions for its application to neural information transmission, Information theoretic analysis for a general queueing system at equilibrium with application to queues in tandem, On convergence of conditional probability measures, The application of the principle of minimum cross-entropy to the characterization of the exponential-type probability distributions, Model selection and Akaike's information criterion (AIC): The general theory and its analytical extensions, The role of duality in optimization problems involving entropy functionals with applications to information theory, Bayesian approach to 
thermostatistics, Properties of the relative entropy of states of von Neumann algebras, Information and statistics. I, Predictive efficiency for simple non-linear models, Quadratically constrained minimum cross-entropy analysis, Akaike's information criterion and Kullback-Leibler loss for histogram density estimation, An informational divergence geometry for stochastic matrices, An extension of the method of maximum likelihood and the Stein's problem, An objective use of Bayesian models, Measuring truthlikeness, Stochastic operators, information, and entropy, Informational divergence and the dissimilarity of probability distributions, General two-place information functions, Local comparison of linear rank tests, in the Bahadur sense, Stochastic programming with random processes, Cross entropy minimization in uninvadable states of complex populations, Asymmetric Boltzmann machines, Some new information measures for fuzzy sets, Characterization of the relative entropy of states of matrix algebras, Constructing elementary procedures for inference of the gamma distribution, Estimating a model through the conditional MLE, A Bayesian alternative to parametric hypothesis testing, Expectations and entropy inequalities for finite quantum systems, A comparative study of association measures, Completely positive maps and entropy inequalities, On information-improvement, A sequential theory of psychological discrimination, On some functional equations concerning entropy, directed divergence and inaccuracy, Explicativity, corroboration, and the relative odds of hypotheses. With comments by William L. Harper and John R. 
Wettersten, Entropy of type \((\alpha,\beta)\) and other generalized measures in information theory, On generalized information function, Sub additive measures of relative information and inaccuracy, On axiomatic characterization of some non-additive measures of information, Tests of symmetry in three-way contingency tables, Note on generalized information function, Bahadur efficiency and local asymptotic optimality of certain nonparametric tests for independence, Automatic aggregation of categories in multivariate contingency tables using information theory., Combining ranked mean value forecasts, Subjective estimation of the delay time distribution in maintenance modelling, A consistent nonparametric test for serial independence, Maximum likelihood procedure adapted to sampling schemes, On entropy functionals of states of operator algebras, The estimation of prior from Fisher information, The Kullback-Leibler risk of the Stein estimator and the conditional MLE, An information theoretic argument for the validity of the exponential model, Testing for nonlinearity using redundancies: Quantitative and qualitative aspects, A maximum entropy approach to estimation and inference in dynamic models or Counting fish in the sea using maximum entropy, Information and probabilistic reasoning, Entropy of sums of random digits, Estimation and inference with censored and ordered multinomial response data, An \(R\)-squared measure of goodness of fit for some common nonlinear regression models, Characterizations of sum form information measures on open domains, Duality and equilibrium prices in economics of uncertainty, On the entropic regularization method for solving min-max problems with applications, A survey of some mathematical programming models in transportation, Asymptotics of the necessary sample size under small error probabilities, On the phase space approach to complexity, Information theory and electron density, Some properties of affinity and applications, Relative
information functions and their type (\(\alpha, \beta\)) generalizations, Generalized maximum entropy estimation of dynamic programming models with sample selection bias, Large deviations from the thermodynamic limit in globally coupled maps, Information transfer in continuous processes, Information theoretic framework for process control, Extremal exponents of random dynamical systems do not vanish, A measure of discrimination between two residual life-time distributions and its applications, Estimation of best predictors of binary response, Discrimination distance bounds and statistical applications, Loglinear models and categorical data analysis with psychometric and econometric applications, The significance test controversy, Vines -- a new graphical model for dependent random variables., A generalized maxentropic inversion procedure for noisy data., Information and entropy econometrics -- editor's view., Information indices: Unification and applications., Connections between entropic and linear projections in asset pricing estimation, Limited information likelihood and Bayesian analysis, Information theoretic measures of the income distribution in food demand, An entropic framework for the normal distribution in capability analysis, COMPUTERIZED METHODOLOGY FOR THE EVALUATION OF LEVEL OF KNOWLEDGE, A Justification for Applying the Principle of Minimum Relative Entropy to Information Integration Problems, On the classification of observations structured into groups, Asymptotic Confidence Intervals for the Relative Relapse Rate Under Random Censorship, Efficiency in the Use of Hotelling's T2, USING THE MUTUAL INFORMATION COEFFICIENT TO IDENTIFY LAGS IN NONLINEAR MODELS, A modified likelihood ratio test for the mean direction in the von Mises distribution, On the choice of a discrepancy functional for model selection, Kullback-Leibler information approach to the optimum measurement point for Bayesian estimation, Minimally informative distributions with given rank
correlation for use in uncertainty analysis, Post-processing techniques for the joint CEC/USNRC uncertainty analysis of accident consequence codes, A consistency algorithm based on information theory, Riemannian and Finslerian geometry in thermodynamics, Application of the method of incremental coefficients (MIC) algorithm to inertial systems, Loss-based optimal control statistics for control charts, Optimal Unconditional Asymptotic Test in 2 × 2 Multinomial Trials, A Simulation Study to Investigate the Behavior of the Log-Density Ratio Under Normality, A NEW ALGORITHM FOR ESTIMATING THE RISK OF NATURAL DISASTERS WITH INCOMPLETE DATA, A new approach to goodness-of-fit testing based on the integrated empirical process, Implications of Form Invariance to the Structure of Nonextensive Entropies, The information for the direction of dependence in \(l_1\) regression, QUANTUM MECHANICS AND PATTERN RECOGNITION, UNCOVERING SHORT-TIME CORRELATIONS BETWEEN MULTICHANNEL RECORDINGS OF BRAIN ACTIVITY: A PHASE-SPACE APPROACH, A new class of random vector entropy estimators and its applications in testing statistical hypotheses, Projections of probability measures, A Modified Akaike Criterion for Model Choice in Generalized Linear Models, Resolving hypotheses with successive chisquares, A New Approach of Information Discrepancy to Analysis of Questionnaire Data, Dimension Reduction with Linear Discriminant Functions Based on an Odds Ratio Parameterization, Evaluation of the Kullback-Leibler Discrepancy for Model Selection in Open Population Capture-Recapture Models, Hypothesis testing for arbitrarily varying source, COMPLEXITY AS A MEASURE OF THE DIFFICULTY OF SYSTEM DIAGNOSIS, A GENERALIZATION OF SHANNON'S INFORMATION THEORY, On tests of symmetry, marginal homogeneity and quasi-symmetry in two-way contingency tables based on minimum φ-divergence estimator with constraints, DIC in variable selection, Sampling distributions associated with the multivariate t
distribution, Testing hypotheses for Markov chains when the parameter space is finite, Testing goodness-of-fit for Laplace distribution based on maximum entropy, AN OLD-NEW CONCEPT OF CONVEX RISK MEASURES: THE OPTIMIZED CERTAINTY EQUIVALENT, A Viable Alternative to Resorting to Statistical Tables, APPLICATIONS OF DENSITY MATRICES IN A TRAPPED BOSE GAS, Asymptotic behavior of sequential design with costs of experiments. (The case of normal distribution), Asymptotically most informative procedure in the case of exponential families, Equivalence of the maximum likelihood estimator to a minimum entropy estimator, Relative efficiency of the Wald ${\rm SPRT}$ and the Chernoff information number, Interaction information in multivariate probability distributions, Conditional expectation in an operator algebra. IV. Entropy and information, Asymptotic behavior of sequential design with costs of experiments, The asymptotic distribution of information per unit cost concerning a linear hypothesis for means of two given normal populations, Axiomatic characterizations of some measures of divergence in information, Asymptotic confidence regions and likelihood ratio tests of hypothesis for location and scale parameters based on type II censored samples, On Shannon's entropy, directed divergence and inaccuracy, Generalized interpolation in \(H^\infty\) with a complexity constraint, A simultaneous estimation and variable selection rule, Determination and interpretation of preferred orientation with texture goniometry: An application of indicators to maximum entropy pole- to orientation-density inversion, Maximum entropy and Bayesian approaches to the ratio problem, Optimality, entropy and complexity for nonextensive quantum scattering, Entropy in linear programs, Bayesian clustering of data sets, Computer classification of the EEG time series by Kullback information measure, RECONSTRUCTABILITY
ANALYSIS: Overview and Bibliography, Über eine Klasse von Informationsmaßen für die Bewertung stochastischer (partieller) Informationen, The structure of indices of social mobility and inheritance, On The Principle of Minimum interdependence, Some strategies for mastermind, Metricas riemanianas asociadas a M-divergencias, Tracking control of non-linear stochastic systems by using path cross-entropy and Fokker-Planck equation, Information theoretic multivariate graduation, On \(3\)-dimensional interaction information, Entropy, information flow and variance in regulatory control systems, Invariants of the Markov process by the transformation of variables, Synthesis of input signals in parameter identification in static systems, Information analysis of linear interactions in contingency tables, A stepwise discrete variable selection procedure, The minimum discrimination information approach in analyzing categorical data, Input design for linear dynamic systems using maxmin criteria, An alternative Bayesian approach to the multivariate Behrens-Fisher problem, Data compression and learning in time sequences analysis, Information processes for semimartingale experiments, Class visualization of high-dimensional data with applications., Game theory, maximum entropy, minimum discrepancy and robust Bayesian decision theory, Minimum discrimination information estimator of the mean with known coefficient of variation, A Bayesian approach to diagnosis of asset pricing models, A measure concentration inequality for contracting Markov chains, An optimal property of the exact multinomial test and the extended Fisher's exact test, Global statistical information in exponential experiments and selection of exponential models, Limiting properties of some measures of information, Economical experiments: Bayesian efficient experimental design, Probability distribution of wave velocity in heterogeneous
media due to random phase configuration, Probability distribution of extreme values of wave velocity in stochastic heterogeneous media, Information theory in living systems, methods, applications, and challenges, Stable simulation of fluid flow with high-Reynolds number using Ehrenfest's steps, An immune algorithm with stochastic aging and Kullback entropy for the chromatic number problem, Numerical method for estimating multivariate conditional distributions, Derivation of mixture distributions and weighted likelihood function as minimizers of KL-divergence subject to constraints, An iterative procedure for general probability measures to obtain \(I\)-projections onto intersections of convex sets, Regularity properties and pathologies of position-space renormalization-group transformations: scope and limitations of Gibbsian theory, Differential entropy and dynamics of uncertainty, On characterization of the Kullback-Leibler mean information for continuous probability distributions, Necessary conditions for the convergence of Kullback-Leibler's mean information, Nonsymmetrical distance between probability distributions, entropy and the theorem of Pythagoras, Explanation, prediction, description, and information theory, Intraclass contingency tables, On the asymptotic distribution of the likelihood ratio under the regularity conditions due to Doob, Some models for individual-group comparisons and group behavior, On the solution of a functional inequality and its applications, On the f-divergence and singularity of probability measures, A class of measures of informativity of observation channels, Some limiting properties of Matusita's measure of distance, Some characterization theorems for generalized measures of uncertainty and information, On the concept of relative information, Dualistic differential geometry of positive definite matrices and its applications to related problems, A remark on the convergence of Kullback-Leibler's mean information, Some information 
theoretic ideas useful in statistical inference, On large time asymptotics for drift-diffusion-Poisson systems, Information-statistical pattern based approach for data mining, ON CONVEX SOBOLEV INEQUALITIES AND THE RATE OF CONVERGENCE TO EQUILIBRIUM FOR FOKKER-PLANCK TYPE EQUATIONS, A maximum relative entropy principle for distribution of personal income with derivations of several known income distributions, Compromise between generalized Bayes and Bayes estimators of Poisson means under entropy loss, Bhattacharyya distance based linear discriminant function for stationary time series, On a general concept of forgetting, Logistic Regression, a review, AXIOMATIC DERIVATION OF THE MUTUAL INFORMATION PRINCIPLE AS A METHOD OF INDUCTIVE INFERENCE, MAXIMUM ENTROPY REVISITED, Una nota sobre la cuantificacion de la incertidumbre correspondiente a las utilidades, Analisis bayesiano de los contrastes de hipotesis parametricos, A Bayes-closed approximation of recursive non-linear estimation, Minimum dynamic discrimination information models, Mean Entropies, Statistical Evidence in Experiments and in Record Values, An argument-dependent approach to determining OWA operator weights based on the rule of maximum entropy, Strategies for inference robustness in focused modelling, BOLTZMANN–SHANNON ENTROPY: GENERALIZATION AND APPLICATION, Information Measures for Some Well-Known Families, An H-theorem for the general relativistic Ornstein-Uhlenbeck process, On entropy production for controlled Markovian evolution, Probability measures over fuzzy spaces, ON THE FAMILIES OF SOLUTIONS TO GENERALIZED MAXIMUM ENTROPY AND MINIMUM CROSS-ENTROPY PROBLEMS, A Theory of Information for Vague Concepts.
Outline of Application to Approximate Reasoning, Model choice for prediction in generalized linear models, Information-theoretic approach to classifying operators in conveyor systems, Modelling of unexpected shift in SPC, Hamiltonian identification for quantum systems: well-posedness and numerical approaches, Joint additive Kullback-Leibler residual minimization and regularization for linear inverse problems, On identifiability of parametric statistical models, Optimal experimental control in econometrics: the simultaneous equation problem, Equivalence of parametric identifiability and estimability, Computational aspects of I-projections, A design of single sampling plans by attributes based on the Kullback-Leibler information, Behavior of Two-Sample Rank Tests at Infinity, Mixed Theories of Information can be Derived by Using Shannon Information Only, New derivations of the maximum likelihood estimator and the likelihood ratio test, Principal component regression under exchangeability, RENYI ENTROPY OF MAPS: APPLICATIONS TO FUZZY SETS, PATTERN RECOGNITION, AND CHAOTIC DYNAMICS, On measures of relative information with preference, On shortest confidence intervals and their relation with uniformly minimum variance unbiased estimators, Measuring economic efficiency with stochastic input-output data, Mixed strategy and information theory in optimal portfolio choice, UNCERTAINTY AND ESTIMATION IN RECONSTRUCTABILITY ANALYSIS, Mathematical techniques for quantum communication theory, Discriminant Variables, A note on solution of large sparse maximum entropy problems with linear equality constraints, M.D.I.
estimation via unconstrained convex programming, Estimacion secuencial optima de una distribucion binomial tomando como perdida la divergencia funcional, Renewal theory and the sequential design of experiments with two states of nature, A class of statistics based on the information concept