Calculation of the amount of information about a random function contained in another such function

Publication: 3258155

DOI: 10.1090/trans2/012/09
zbMath: 0087.13201
OpenAlex: W4241085363
MaRDI QID: Q3258155

Akiva M. Yaglom, Israel M. Gel'fand

Publication date: 1959

Published in: Eleven Papers on Analysis, Probability and Topology

Full work available at URL: https://doi.org/10.1090/trans2/012/09

Related Items

Probabilistic regularization of Fredholm integral equations of the first kind
Canonical analysis relative to a closed subspace
Data processing using information theory functionals
Information-based long-range dependence
Mutual information in the frequency domain
Calculation of the Shannon information
Information and filtering
Exact dimension of Furstenberg measures
Quantum mutual entropy defined by liftings
Some aspects of quantum information theory and their applications to irreversible processes
Metric and probabilistic information associated with Fredholm integral equations of the first kind
On some properties of Gaussian channels
Quantum algorithm for SAT problem and quantum mutual entropy
The general theory of canonical correlation and its relation to functional analysis
Block coherence: a method for measuring the interdependence between two blocks of neurobiological time series
An extension of the application of information theory to forecasting
Gelfand-Yaglom-Perez theorem for generalized relative entropy functionals
A new treatment of communication processes with Gaussian channels
Continuity of entropy and mutual entropy in C*-dynamical systems
Measures of association for Hilbertian subspaces and some applications
Gaussian channels and the optimal coding
On Quantum Capacity and its Bound
On the spectral formulation of Granger causality
A method for approximate representation of vector-valued time series and its relation to two alternatives
Local independence of fractional Brownian motion
Model reduction by phase matching
Capacity of mismatched Gaussian channels with and without feedback
The decomposition and measurement of the interdependency between second-order stationary processes
On the concept of relative information
The Capacity of the White Gaussian Noise Channel
Mutual information in Gaussian channels
Entropy and dimension of disintegrations of stationary measures
Canonical correlations of past inputs and future outputs for linear stochastic systems
On Wiener-Granger causality, information and canonical correlation
Modelling the dynamics of nonlinear time series using canonical variate analysis
Spectral factor reduction by phase matching: the continuous-time single-input single-output case†