Calculation of the amount of information about a random function contained in another such function
From MaRDI portal
Publication: 3258155
DOI: 10.1090/TRANS2/012/09
zbMATH Open: 0087.13201
OpenAlex: W4241085363
MaRDI QID: Q3258155
FDO: Q3258155
Authors: Akiva M. Yaglom, I. M. Gel'fand
Publication date: 1959
Published in: Eleven Papers on Analysis, Probability and Topology
Full work available at URL: https://doi.org/10.1090/trans2/012/09
Cited In (37)
- Canonical analysis relative to a closed subspace
- Mutual information in Gaussian channels
- Exact dimension of Furstenberg measures
- Measures of association for Hilbertian subspaces and some applications
- Probabilistic regularization of Fredholm integral equations of the first kind
- The decomposition and measurement of the interdependency between second-order stationary processes
- Capacity of mismatched Gaussian channels with and without feedback
- Continuity of entropy and mutual entropy in C*-dynamical systems
- Data processing using information theory functionals
- The general theory of canonical correlation and its relation to functional analysis
- On Quantum Capacity and its Bound
- Information and filtering
- On the spectral formulation of Granger causality
- A method for approximate representation of vector-valued time series and its relation to two alternatives
- Some aspects of quantum information theory and their applications to irreversible processes
- Metric and probabilistic information associated with Fredholm integral equations of the first kind
- Calculation of the Shannon information
- Quantum algorithm for SAT problem and quantum mutual entropy
- An extension of the application of information theory to forecasting
- Local independence of fractional Brownian motion
- Information-based long-range dependence
- Gaussian channels and the optimal coding
- Mutual information in the frequency domain
- Entropy and dimension of disintegrations of stationary measures
- Canonical correlations of past inputs and future outputs for linear stochastic systems
- A new treatment of communication processes with Gaussian channels
- Spectral factor reduction by phase matching: the continuous-time single-input single-output case
- Gelfand-Yaglom-Perez theorem for generalized relative entropy functionals
- Model reduction by phase matching
- Block coherence: a method for measuring the interdependence between two blocks of neurobiological time series
- The Capacity of the White Gaussian Noise Channel
- On some properties of Gaussian channels
- On Wiener-Granger causality, information and canonical correlation
- On the concept of relative information
- Quantum mutual entropy defined by liftings
- Modelling the dynamics of nonlinear time series using canonical variate analysis
This page was built for publication: Calculation of the amount of information about a random function contained in another such function