scientific article
From MaRDI portal
Publication:3260839
zbMath: 0088.10406 · MaRDI QID: Q3260839
Publication date: 1959
Title: unavailable (zbMATH Open Web Interface contents unavailable due to conflicting licenses)
Classification:
Introductory exposition (textbooks, tutorial papers, etc.) pertaining to statistics (62-01)
Information theory (general) (94A15)
Statistical aspects of information-theoretic topics (62B10)
Related Items
Information processes for semimartingale experiments, Class visualization of high-dimensional data with applications., Systemic risk measures, On entropy functionals of states of operator algebras, Game theory, maximum entropy, minimum discrepancy and robust Bayesian decision theory, The estimation of prior from Fisher information, The Kullback-Leibler risk of the Stein estimator and the conditional MLE, An information theoretic argument for the validity of the exponential model, An online prediction algorithm for reinforcement learning with linear function approximation using cross entropy method, Testing for nonlinearity using redundancies: Quantitative and qualitative aspects, New bounds for Shannon, relative and Mandelbrot entropies via Hermite interpolating polynomial, Minimum discrimination information estimator of the mean with known coefficient of variation, A maximum entropy approach to estimation and inference in dynamic models or Counting fish in the sea using maximum entropy, A Bayesian approach to diagnosis of asset pricing models, Information and probabilistic reasoning, Entropy of sums of random digits, Optimal hazard models based on partial information, Estimation and inference with censored and ordered multinomial response data, An \(R\)-squared measure of goodness of fit for some common nonlinear regression models, Characterizations of sum form information measures on open domains, Duality and equilibrium prices in economics of uncertainty, On the entropic regularization method for solving min-max problems with applications, Evaluating systemic risk using bank default probabilities in financial networks, A survey of some mathematical programming models in transportation, Asymptotics of the necessary sample size under small error probabilities, A measure concentration inequality for contracting Markov chains, On the phase space approach to complexity, Information theory and electron density, A finite volume scheme for boundary-driven 
convection-diffusion equations with relative entropy structure, Geometric characterization of Weyl's discrepancy norm in terms of its \(n\)-dimensional unit balls, Hypergraphs as a mean of discovering the dependence structure of a discrete multivariate probability distribution, Some properties of affinity and applications, Relative information functions and their type (\(\alpha, \beta\)) generalizations, Concavification of free entropy, Information-theoretic multiplicities of chemical bond in Shull's model of \(H_2\), Optimal measures and Markov transition kernels, Theoretical foundation for CMA-ES from information geometry perspective, Uniform-in-bandwidth consistency for kernel-type estimators of Shannon's entropy, Convergence rate of Markov chain methods for genomic motif discovery, Portfolio selection under model uncertainty: a penalized moment-based optimization approach, Simplification and hierarchical representations of mixtures of exponential families, Generalized maximum entropy estimation of dynamic programming models with sample selection bias, An optimal property of the exact multinomial test and the extended Fisher's exact test, Cross entropy minimization in uninvadable states of complex populations, Asymmetric Boltzmann machines, Global statistical information in exponential experiments and selection of exponential models, Information-based parameterization of the log-linear model for categorical data analysis, An \((R',S')\)-norm fuzzy relative information measure and its applications in strategic decision-making, Some new information measures for fuzzy sets, Characterization of the relative entropy of states of matrix algebras, Constructing elementary procedures for inference of the gamma distribution, Estimating a model through the conditional MLE, A Bayesian alternative to parametric hypothesis testing, Expectations and entropy inequalities for finite quantum systems, A comparative study of association measures, Completely positive maps and 
entropy inequalities, On information-improvement, A sequential theory of psychological discrimination, On some functional equations concerning entropy, directed divergence and inaccuracy, Explicativity, corroboration, and the relative odds of hypotheses. With comments by William L. Harper and John R. Wettersten, On \textit{phase}-equilibria in molecules, Quantum information descriptors and communications in molecules, Entropy of type \((\alpha,\beta)\) and other generalized measures in information theory, On generalized information function, Sub additive measures of relative information and inaccuracy, More accurate majorization inequalities obtained via superquadraticity and convexity with application to entropies, Chernoff distance for conditionally specified models, On axiomatic characterization of some non-additive measures of information, In between the \(LQG/H_2\)- and \(H_{\infty } \)-control theories, Tests of symmetry in three-way contingency tables, Note on generalized information function, Entropy-type inequalities for generalized gamma densities, A nonparametric statistical snake model using the gradient flow of minimum probability density integration, A new definition of cross-entropy for uncertain variables, Information theoretic framework for process control, Extremal exponents of random dynamical systems do not vanish, Bahadur efficiency and local asymptotic optimality of certain nonparametric tests for independence, A measure of discrimination between two residual life-time distributions and its applications, The Bonnet theorem for statistical manifolds, Composite likelihood methods: Rao-type tests based on composite minimum density power divergence estimator, Automatic aggregation of categories in multivariate contingency tables using information theory., Combining ranked mean value forecasts, Subjective estimation of the delay time distribution in maintenance modelling, On minimax detection of Gaussian stochastic sequences and Gaussian stationary 
signals, Estimation of best predictors of binary response, Discrimination distance bounds and statistical applications, Large deviations from the thermodynamic limit in globally coupled maps, Loglinear models and categorical data analysis with psychometric and econometric applications, A consistent nonparametric test for serial independence, Maximum likelihood procedure adapted to sampling schemes, The significance test controversy, Vines -- a new graphical model for dependent random variables., A generalized maxentropic inversion procedure for noisy data., Information and entropy econometrics -- editor's view., Information indices: Unification and applications., Connections between entropic and linear projections in asset pricing estimation, Limited information likelihood and Bayesian analysis, Information theoretic measures of the income distribution in food demand, Information transfer in continuous processes, Data compression and learning in time sequences analysis, On local divergences between two probability measures, Superstability of the functional equation related to distance measures, On convergence of conditional probability measures, Multi-choice linear programming for matrix game, Sherman's and related inequalities with applications in information theory, On entropy-continuity descriptors of molecular equilibrium states, Inference in a model of successive failures with shape-adjusted hazard rates, The application of the principle of minimum cross-entropy to the characterization of the exponential-type probability distributions, Model selection and Akaike's information criterion (AIC): The general theory and its analytical extensions, Stability, convergence to self-similarity and elastic limit for the Boltzmann equation for inelastic hard spheres, The role of duality in optimization problems involving entropy functionals with applications to information theory, Use of non-additive information measures in exploring molecular electronic structure: 
Stockholder bonded atoms and role of kinetic energy in the chemical bond, Many-orbital probabilities and their entropy/information descriptors in orbital communication theory of the chemical bond, Additive and non-additive information channels in orbital communication theory of the chemical bond, Entropy/information coupling between orbital-communications in molecular subsystems, Information gain and approaching true belief, Bounds for \(f\)-divergences under likelihood ratio constraints., Bayesian approach to thermostatistics, New computational model of an isotropic ``broken exponentially correlated'' random field, Properties of the relative entropy of states of von Neumann algebras, Information and statistics. I, An inexact accelerated proximal gradient method and a dual Newton-CG method for the maximal entropy problem, Synchronisation effects on the behavioural performance and information dynamics of a simulated minimally cognitive robotic agent, Performance evaluation of imputation based on Bayesian networks, Predictive efficiency for simple non-linear models, Quadratically constrained minimum cross-entropy analysis, An optimization principle for deriving nonequilibrium statistical models of Hamiltonian dynamics, Akaike's information criterion and Kullback-Leibler loss for histogram density estimation, A generalization of \(f\)-divergence measure to convex functions defined on linear spaces, An informational divergence geometry for stochastic matrices, An extension of the method of maximum likelihood and the Stein's problem, On probability flow descriptors in position and momentum spaces, An objective use of Bayesian models, Physical portrayal of computational complexity, Measuring the component overlapping in the Gaussian mixture model, Relative entropy between quantum ensembles, Measuring truthlikeness, Bootstrap-based model selection criteria for beta regressions, On the sample information about parameter and prediction, On length biased dynamic measure of 
past inaccuracy, Stochastic operators, information, and entropy, Can the maximum entropy principle be explained as a consistency requirement?, Information amount and higher-order efficiency in estimation, Misspecifying the shape of a random effects distribution: why getting it wrong may not matter, Relative log-concavity and a pair of triangle inequalities, Informational divergence and the dissimilarity of probability distributions, Basis set dependence of molecular information channels and their entropic bond descriptors, Finding transcription factor binding motifs for coregulated genes by combining sequence overrepresentation with cross-species conservation, On Kullback-Leibler information of order statistics in terms of the relative risk, On the construction of minimum information bivariate copula families, General two-place information functions, Modelling lymphoma therapy and outcome, Quantum information approach to electronic equilibria: molecular fragments and \textit{non}-equilibrium thermodynamic description, Multi-sample Rényi test statistics, Divergence-based tests of homogeneity for spatial data, Bayesian image superresolution and hidden variable modeling, Local comparison of linear rank tests, in the Bahadur sense, Entropy/information descriptors of the chemical bond revisited, All in action, Length biased weighted residual inaccuracy measure, Stochastic programming with random processes, Unsupervised weight parameter estimation method for ensemble learning, Behaviour of the Fokker-Planck-Boltzmann equation near a Maxwellian, Markovianness and conditional independence in annotated bacterial DNA, Application of data compression methods to nonparametric estimation of characteristics of discrete-time stochastic processes, Convergence properties of high-order Boltzmann machines, Estimation for the multi-way error components model with ill-conditioned panel data, On the effectiveness of Monte Carlo for initial uncertainty forecasting in nonlinear dynamical 
systems, Zipf-Mandelbrot law, \(f\)-divergences and the Jensen-type interpolating inequalities, Dynamic merge of the global and local models for sustainable land use planning with regard for global projections from GLOBIOM and local technical-economic feasibility and resource constraints, Towards a geometry of imprecise inference, Sequential category aggregation and partitioning approaches for multi-way contingency tables based on survey and census data, Asymptotic robustness study of the polychoric correlation estimation, Coherent infomax as a computational goal for neural systems, A dynamic measure of inaccuracy between two past lifetime distributions, Statistical models: conventional, penalized and hierarchical likelihood, Combinatorial entropies and statistics, Replicator equations and the principle of minimal production of information, Thermodynamic uncertainty relations and irreversibility, Effects of collinearity on information about regression coefficients, An alternate approach to pseudo-likelihood model selection in the generalized linear mixed modeling framework, Optimizing subsurface field data acquisition using information theory, Point estimation with exponentially tilted empirical likelihood, Asymptotic inference for semiparametric association models, Estimating cell probabilities in contingency tables with constraints on marginals/conditionals by geometric programming with applications, Clustering objects described by juxtaposition of binary data tables, A classification of the main probability distributions by minimizing the weighted logarithmic measure of deviation, Limiting values of large deviation probabilities of quadratic statistics, Between-group analysis with heterogeneous covariance matrices: The common principal component model, The formal definition of reference priors, Verteilungsmaße und Verteilungsindizes, An application of the discrimination information measure to the theory of testing hypotheses. 
I, Notion of information and independent component analysis., Stigler's approach to recovering the distribution of first significant digits in natural data sets, Percolation of entropy functionals on Cayley tree graphs as a method of order-disorder character diagnostics of complex structures, A quantitative Occam's razor, The principle of maximum entropy, A maximum entropy criterion of filtering and coding for stationary autoregressive signals: Its physical interpretations and suggestions for its application to neural information transmission, Information theoretic analysis for a general queueing system at equilibrium with application to queues in tandem, Extropy: complementary dual of entropy, On the Choice of Nonparametric Entropy Estimator in Entropy-Based Goodness-of-Fit Test Statistics, Unnamed Item, Robust downscaling approaches to disaggregation of data and projections under uncertainties: case of land cover and land use change systems, On entropy production for controlled Markovian evolution, Economical experiments: Bayesian efficient experimental design, Probability measures over fuzzy spaces, A remark on the convergence of Kullback-Leibler's mean information, An information-theoretic proof of Nash's inequality, Maximum L\(q\)-likelihood estimation, On Rényi entropies of order statistics, An Epistemological Approach to Steganography, Estimating Steganographic Fisher Information in Real Images, Parametric R-norm directed-divergence convex function, Probability distribution of wave velocity in heterogeneous media due to random phase configuration, Probability distribution of extreme values of wave velocity in stochastic heterogeneous media, Some information theoretic ideas useful in statistical inference, A GENERALIZED ENTROPY-BASED RESIDUAL LIFETIME DISTRIBUTIONS, Logistic Regression, a review, Approximation of continuous random variables for the evaluation of the reliability parameter of complex stress-strength models, Kullback–Leibler information of a 
censored variable and its applications, Unnamed Item, ON THE FAMILIES OF SOLUTIONS TO GENERALIZED MAXIMUM ENTROPY AND MINIMUM CROSS-ENTROPY PROBLEMS, Jensen-Renyi's-Tsallis fuzzy divergence information measure with its applications, Model-free data-driven inference in computational mechanics, Application of relative entropy theory to streamwise velocity profile in open-channel flow: effect of prior probability distributions, Asymptotic simplification of aggregation-diffusion equations towards the heat kernel, Information theory in living systems, methods, applications, and challenges, Equitability, mutual information, and the maximal information coefficient, On phases and interference of local communications in molecules, Communications in molecules: local and multi-configuration channels and their entropic descriptors of bond multiplicity and composition, On large time asymptotics for drift-diffusion-poisson systems, Stable simulation of fluid flow with high-Reynolds number using Ehrenfest's steps, An immune algorithm with stochastic aging and Kullback entropy for the chromatic number problem, INFORMATION AND PARTICLE PHYSICS, AXIOMATIC DERIVATION OF THE MUTUAL INFORMATION PRINCIPLE AS A METHOD OF INDUCTIVE INFERENCE, Information-statistical pattern based approach for data mining, ON CONVEX SOBOLEV INEQUALITIES AND THE RATE OF CONVERGENCE TO EQUILIBRIUM FOR FOKKER-PLANCK TYPE EQUATIONS, Minimum and Maximum Information Censoring Plans in Progressive Censoring, Rational approximations of spectral densities based on the Alpha divergence, Numerical method for estimating multivariate conditional distributions, Entropic descriptors of quantum communications in molecules, MAXIMUM ENTROPY REVISITED, Derivation of mixture distributions and weighted likelihood function as minimizers of KL-divergence subject to constraints, Game theoretical optimization inspired by information theory, An iterative procedure for general probability measures to obtain \(I\)-projections onto 
intersections of convex sets, Regularity properties and pathologies of position-space renormalization-group transformations: scope and limitations of Gibbsian theory, Kullback-Leibler divergence and mutual information of experiments in the fuzzy case, Differential entropy and dynamics of uncertainty, Limiting properties of some measures of information, Una nota sobre la cuantificacion de la incertidumbre correspondiente a las utilidades, A Survey of Reverse Inequalities for f-Divergence Measure in Information Theory, Analisis bayesiano de los contrastes de hipotesis parametricos, A bayes-closed approximation of recursive non-linear estimation, Minimum dynamic discrimination information models, Mean Entropies, Estimating the rate constant from biosensor data via an adaptive variational Bayesian approach, Two-Sided Generalized Exponential Distribution, On characterization of the Kullback-Leibler mean information for continuous probability distributions, Necessary conditions for the convergence of Kullback-Leibler's mean information, On the Kullback–Leibler information of hybrid censored data, On entropy of a Pareto distribution in the presence of outliers, Special feature: Information theory and statistics, Nonsymmetrical distance between probability distributions, entropy and the theorem of Pythagoras, Explanation, prediction, description, and information theory, Counter-Factual Reinforcement Learning: How to Model Decision-Makers That Anticipate the Future, Intraclass contingency tables, On the asymptotic distribution of the likelihood ratio under the regularity conditions due to Doob, Some models for individual-group comparisons and group behavior, On the solution of a functional inequality and its applications, On the f-divergence and singularity of probability measures, Information functionals with applications to random walk and statistics, A class of measures of informativity of observation channels, New Entropy Estimator with an Application to Test of 
Normality, Some limiting properties of Matusita's measure of distance, Some characterization theorems for generalized measures of uncertainty and information, The information in covariate imbalance in studies of hormone replacement therapy, On the concept of relative information, Conjugate predictive distributions and generalized entropies, Dualistic differential geometry of positive definite matrices and its applications to related problems, Clustering time series by linear dependency, Statistical Evidence in Experiments and in Record Values, An argument-dependent approach to determining OWA operator weights based on the rule of maximum entropy, Strategies for inference robustness in focused modelling, Beyond hybrid generative discriminative learning: spherical data classification, Robust estimators for one-shot device testing data under gamma lifetime model with an application to a tumor toxicological data, On divergence tests for composite hypotheses under composite likelihood, A correspondence principle for relative entropy minimization, BOLTZMANN–SHANNON ENTROPY: GENERALIZATION AND APPLICATION, TCMI: a non-parametric mutual-dependence estimator for multivariate continuous distributions, A model for learning cause-effect relationships in Bayesian networks, A maximum relative entropy principle for distribution of personal income with derivations of several known income distributions, Compromise between generalized Bayes and Bayes estimators of Poisson means under entropy loss, Information Measures for Some Well-Known Families, Bhattacharyya distance based linear discriminant function for stationary time series, An H-theorem for the general relativistic Ornstein-Uhlenbeck process, Locally associated graphical models and mixed convex exponential families, A unified statistical approach to non-negative matrix factorization and probabilistic latent semantic indexing, On a general concept of forgetting, Resultant entropy/information, phase/entropy continuity and 
bond multiplicities in degenerate electronic states, Measuring the asymmetric contributions of individual subsystems, Information theoretic methods in small domain estimation, Asymptotically most informative procedure in the case of exponential families, Equivalence of the maximum likelihood estimator to a minimum entropy estimator, Sherman's inequality and its converse for strongly convex functions with applications to generalized f-divergences, Generalized maximum entropy estimation, A New Approach of Information Discrepancy to Analysis of Questionnaire Data, Unnamed Item, Estimation of entropies on time scales by Lidstone's interpolation using Csiszár-type functional, Relative efficiency of the Wald ${\rm SPRT}$ and the Chernoff information number, Interaction information in multivariate probability distributions, Mutual information as a measure of multivariate association: analytical properties and statistical estimation, A study of Rényi entropy based on the information geometry formalism, A unified sweep-stick mechanism to explain particle clustering in two- and three-dimensional homogeneous, isotropic turbulence, Conditional expectation in an operator algebra. IV. 
Entropy and information, On tests of symmetry, marginal homogeneity and quasi-symmetry in two-way contingency tables based on minimum φ-divergence estimator with constraints, Asymptotic behavior of sequential design with costs of experiments, DIC in variable selection, Sampling distributions associated with the multivariate t distribution, Analyzing local and global properties of multigraphs, The asymptotic distribution of information per unit cost concerning a linear hypothesis for means of two given normal populations, Some results for maximum likelihood estimation of adjusted relative risks, Statistical Problem Classes and Their Links to Information Theory, Nearly Optimal Static Las Vegas Succinct Dictionary, Axiomatic characterizations of some measures of divergence in information, Dimension Reduction with Linear Discriminant Functions Based on an Odds Ratio Parameterization, Class Discovery via Nonnegative Matrix Factorization, Testing hypotheses for Markov chains when the parameter space is finite, Asymptotic confidence regions and likelihood ratio tests of hypothesis for location and scale parameters based on type II censored samples, Rates of the strong uniform consistency for the kernel-type regression function estimators with general kernels on manifolds, SSUE: Simultaneous state and uncertainty estimation for dynamical systems, On Shannon's entropy, directed divergence and inaccuracy, User-friendly Introduction to PAC-Bayes Bounds, Unnamed Item, Generalizations of cyclic refinements of Jensen's inequality by Lidstone's polynomial with applications in information theory, Some results on quantile version of Rényi entropy of order statistics, Models and Software for Urban and Regional Transportation Planning: The Contributions of the Center for Research on Transportation, On reconsidering entropies and divergences and their cumulative counterparts: Csiszár's, DPD's and Fisher's type cumulative and survival measures, On the variable bandwidth kernel 
estimation of conditional \(U\)-statistics at optimal rates in sup-norm, Evaluation of the Kullback‐Leibler Discrepancy for Model Selection in Open Population Capture‐Recapture Models, Hypothesis testing for arbitrarily varying source, Unnamed Item, Unnamed Item, Estimating Field-Level Rotations as Dynamic Cycles, Enhancing Productivity and Market Access for Key Staples in the EAC Region: An Economic Analysis of Biophysical and Market Potential, Censored Kullback-Leibler Information and Goodness-of-Fit Test with Type II Censored Data, Normalization of the origin-shifted exponential distribution for control chart construction, Data disaggregation procedures within a maximum entropy framework, An approach to setting up a national customer satisfaction index: the Jordan case study, COMPLEXITY AS A MEASURE OF THE DIFFICULTY OF SYSTEM DIAGNOSIS, Non-Maxwellian kinetic equations modeling the dynamics of wealth distribution, Estimation of cost allocation coefficients at the farm level using an entropy approach, EEG Data Space Adaptation to Reduce Intersession Nonstationarity in Brain-Computer Interface, Coding Accuracy Is Not Fully Determined by the Neuronal Model, Financial portfolios based on Tsallis relative entropy as the risk measure, A Quasi-Likelihood Approach to Nonnegative Matrix Factorization, On Nonnegative Matrix Factorization Algorithms for Signal-Dependent Noise with Application to Electromyography Data, Maximum Likelihood Estimation of Regularization Parameters in High-Dimensional Inverse Problems: An Empirical Bayesian Approach. 
Part II: Theoretical Analysis, DISTANCE BETWEEN PHYSICAL THEORIES BASED ON INFORMATION THEORY, Deep Reinforcement Learning: A State-of-the-Art Walkthrough, An Information Criterion for Choosing Observation Locations in Data Assimilation and Prediction, DISTINGUISHING ATTACKS ON BLOCK CIPHERS BY DIFFERENTIALS OF TWO-BLOCK TEXTS, Generalized interpolation in 𝐻^{∞} with a complexity constraint, Unnamed Item, Multiple Objects: Error Exponents in Hypotheses Testing and Identification, QUANTUM MECHANICS AND PATTERN RECOGNITION, UNCOVERING SHORT-TIME CORRELATIONS BETWEEN MULTICHANNEL RECORDINGS OF BRAIN ACTIVITY: A PHASE-SPACE APPROACH, Prediction of time series by statistical learning: general losses and fast rates, A simultaneous estimation and variable selection rule, Determination and interpretation of preferred orientation with texture goniometry: An application of indicators to maximum entropy pole- to orientation-density inversion, Testing goodness-of-fit for Laplace distribution based on maximum entropy, Unnamed Item, AN OLD‐NEW CONCEPT OF CONVEX RISK MEASURES: THE OPTIMIZED CERTAINTY EQUIVALENT, A new class of random vector entropy estimators and its applications in testing statistical hypotheses, A Viable Alternative to Resorting to Statistical Tables, Gain-loss pricing under ambiguity of measure, A GENERALIZATION OF SHANNON'S INFORMATION THEORY, Maximum entropy and Bayesian approaches to the ratio problem, Optimality, entropy and complexity for nonextensive quantum scattering, Unnamed Item, New bounds for Shannon, relative and Mandelbrot entropies via Abel-Gontscharoff interpolating polynomial, Converse to the Sherman inequality with applications, Monotonicity of the Jensen functional for f-divergences with applications to the Zipf-Mandelbrot law, On a Jensen-type inequality for generalized f-divergences and Zipf-Mandelbrot law, Artificial sequences and complexity measures, Entropy-based goodness-of-fit tests for the Pareto I distribution, Covariate selection 
for accelerated failure time data, Projections of probability measures, APPLICATIONS OF DENSITY MATRICES IN A TRAPPED BOSE GAS, Generalized Csiszár's f-divergence for Lipschitzian functions, On interactions in contingency tables, A Formal Semantics of Influence in Bayesian Reasoning, Mixed Solution Strategy for MCGDM Problems Using Entropy/Cross Entropy in Interval-Valued Intuitionistic Fuzzy Environment, Some Functional Equations Related to the Characterizations of Information Measures and Their Stability, Quantum collapse rules from the maximum relative entropy principle, Asymmetry of Risk and Value of Information, Interval estimation: An information theoretic approach, Asymptotic behavior of sequential design with costs of experiments. (The case of normal distribution), A Modified Akaike Criterion for Model Choice in Generalized Linear Models, Resolving hypotheses with successive chisquares, On minimax detection of Gaussian stochastic sequences with imprecisely known means and covariance matrices, Some estimations of the Jensen difference and applications, Approaching probabilistic laws, Extropy based inaccuracy measure in order statistics, Data-driven games in computational mechanics, On The Principle of Minimum interdependence, Some strategies for mastermind, An entropic framework for the normal distribution in capability analysis, COMPUTERIZED METHODOLOGY FOR THE EVALUATION OF LEVEL OF KNOWLEDGE, On a generalized directed-divergence function, Metricas riemanianas asociadas a M-divergencias, The information for the direction of dependence in \(l_1\) regression, A Justification for Applying the Principle of Minimum Relative Entropy to Information Integration Problems, Unnamed Item, Unnamed Item, Unnamed Item, On the classification of observations structured into groups, Behavior of Two-Sample Rank Tests at Infinity, Mixed Theories of Information can be Derived by Using Shannon Information Only, Tracking control of non-linear stochastic systems by using path 
cross-entropy and Fokker-Planck equation, Asymptotic Confidence Intervals for the Relative Relapse Rate Under Random Censorship, Efficiency in the Use of Hotelling's T2, A Theory of Information for Vague Concepts. Outline of Application to Approximate Reasoning, Model choice for prediction in generalized linear models, Information-theoretic approach to classifying operators in conveyor systems, Unnamed Item, New derivations of the maximum likelihood estimator and the likelihood ratio test, Unnamed Item, Unnamed Item, USING THE MUTUAL INFORMATION COEFFICIENT TO IDENTIFY LAGS IN NONLINEAR MODELS, Information theoretic multivariate graduation, Unnamed Item, Goodness-of-fit tests based on Verma Kullback–Leibler information, A modified likelihood ratio test for the mean direction in the von Mises distribution, On the choice of a discrepancy functional for model selection, Kullback-Leibler information approach to the optimum measurement point for Bayesian estimation, Information gain when measuring an unknown qubit, Principal component regression under exchangeability, Minimally informative distributions with given rank correlation for use in uncertainty analysis, Post-processing techniques for the joint CEC/USNRC uncertainty analysis of accident consequence codes, RENYI ENTROPY OF MAPS: APPLICATIONS TO FUZZY SETS, 
PATTERN RECOGNITION, AND CHAOTIC DYNAMICS, On $3$-dimensional interaction information, A consistency algorithm based on information theory, On generalized measures of relative information and inaccuracy, On measures of relative information with preference, SOME NEW PROPERTIES OF HELLINGER DISTANCE FOR VALIDATING APPROXIMATIONS IN BAYESIAN ANALYSIS, Superresolution in the maximum entropy approach to invert Laplace transforms, On shortest confidence intervals and their relation with uniformly minimum variance unbiased estimators, Entropy, information flow and variance in regulatory control systems, Measuring economic efficiency with stochastic input-output data, A Convergent Iterative Procedure for Constructing Bivariate Distributions, Mixed strategy and information theory in optimal portfolio choice, UNCERTAINTY AND ESTIMATION IN RECONSTRUCTABILITY ANALYSIS, Riemannian and Finslerian geometry in thermodynamics, A General Class of Estimators for the Linear Regression Model Affected by Collinearity and Outliers, Mathematical techniques for quantum communication theory, Generalized log-likelihood functions and Bregman divergences, Overview and construction of meshfree basis functions: from moving least squares to entropy approximants, Application of the method of incremental coefficients (MIC) algorithm to inertial systems, An Algebraic Implicitization and Specialization of Minimum KL-Divergence Models, Modelling of unexpected shift in SPC, Hamiltonian identification for quantum systems: well-posedness and numerical approaches, Joint additive Kullback–Leibler residual minimization and regularization for linear inverse problems, Invariants of the Markov process by the transformation of variables, Discriminant Variables, On identifiability of parametric statistical models, Synthesis of input signals in parameter identification in static systems, A note on solution of large sparse maximum entropy problems with 
linear equality constraints, Information analysis of linear interactions in contingency tables, Loss-based optimal control statistics for control charts, A stepwise discrete variable selection procedure, The minimum discrimination information approach in analyzing categorical data, M.D.I. estimation via unconstrained convex programming, Optimal sequential estimation of a binomial distribution with functional divergence as loss, Renewal theory and the sequential design of experiments with two states of nature, Information Theory, Relative Entropy and Statistics, A class of statistics based on the information concept, Optimal Unconditional Asymptotic Test in 2 × 2 Multinomial Trials, A Simulation Study to Investigate the Behavior of the Log-Density Ratio Under Normality, Entropy in linear programs, A multi-run interactive method for bicriterion optimization problems, Bayesian clustering of data sets, Computer classification of the EEG time series by Kullback information measure, Rigorous Derivation of a Nonlinear Diffusion Equation as Fast-Reaction Limit of a Continuous Coagulation-Fragmentation Model with Diffusion, RECONSTRUCTABILITY ANALYSIS: Overview and Bibliography, On a class of information measures for evaluating stochastic (partial) information, A NEW ALGORITHM FOR ESTIMATING THE RISK OF NATURAL DISASTERS WITH INCOMPLETE DATA, A new approach to goodness-of-fit testing based on the integrated empirical process, Implications of Form Invariance to the Structure of Nonextensive Entropies, Input design for linear dynamic systems using maxmin criteria, Optimal experimental control in econometrics: the simultaneous equation problem, Equivalence of parametric identifiability and estimability, The structure of indices of social mobility and inheritance, Computational aspects of l-projections, A design of single sampling plans by attributes based on the Kullback–Leibler 
information, An alternative Bayesian approach to the multivariate Behrens–Fisher problem