A de Bruijn's identity for dependent random variables based on copula theory
From MaRDI portal
Publication:5358071
DOI: 10.1017/S0269964815000315
zbMATH Open: 1370.94379
MaRDI QID: Q5358071
Authors: Nayereh Bagheri Khoolenjani, Mohammad Hossein Alamatsaz
Publication date: 19 September 2017
Published in: Probability in the Engineering and Informational Sciences
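The publication extends de Bruijn's identity to dependent noise via copulas. As background, the classical independent-noise form of the identity, d/dt h(X + √t·Z) = ½ J(X + √t·Z) for Z ~ N(0, 1) independent of X, can be checked numerically in closed form for a Gaussian input. The sketch below covers only this classical independent case, not the paper's copula-based dependent extension:

```python
import math

# Check of the classical de Bruijn identity for Gaussian input X ~ N(0, sigma2):
#   d/dt h(X + sqrt(t) Z) = (1/2) J(X + sqrt(t) Z),  Z ~ N(0,1) independent of X.
# For Gaussian X, Y_t = X + sqrt(t) Z is N(0, sigma2 + t), so both sides are
# available in closed form:
#   h(Y_t) = 0.5 * log(2*pi*e*(sigma2 + t))   (differential entropy)
#   J(Y_t) = 1 / (sigma2 + t)                 (Fisher information)

def entropy(sigma2: float, t: float) -> float:
    """Differential entropy of N(0, sigma2 + t)."""
    return 0.5 * math.log(2 * math.pi * math.e * (sigma2 + t))

def fisher_info(sigma2: float, t: float) -> float:
    """Fisher information (w.r.t. a location parameter) of N(0, sigma2 + t)."""
    return 1.0 / (sigma2 + t)

sigma2, t, eps = 2.0, 0.5, 1e-6
# Left side: dh/dt via a central finite difference.
lhs = (entropy(sigma2, t + eps) - entropy(sigma2, t - eps)) / (2 * eps)
# Right side: half the Fisher information.
rhs = 0.5 * fisher_info(sigma2, t)
print(lhs, rhs)  # the two sides agree up to finite-difference error
```

For non-Gaussian X, or for Z dependent on X as in this publication, the closed forms above no longer apply and the copula-based identity is needed.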
Recommendations
- Extension of de Bruijn's identity to dependent non-Gaussian noise channels
- Scientific article (zbMATH DE number 7653414)
- de Bruijn identities: From Shannon, Kullback-Leibler and Fisher to generalized φ-entropies, φ-divergences and φ-Fisher informations
- Entropy flow and de Bruijn's identity for a class of stochastic differential equations driven by fractional Brownian motion
- INEQUALITIES FOR THE DEPENDENT GAUSSIAN NOISE CHANNELS BASED ON FISHER INFORMATION AND COPULAS
Cites Work
- Elements of Information Theory
- Title not available
- Title not available
- Title not available
- An introduction to copulas
- A Mathematical Theory of Communication
- Title not available
- A new entropy power inequality
- Information Theoretic Proofs of Entropy Power Inequalities
- Some inequalities satisfied by the quantities of information of Fisher and Shannon
- Mutual Information and Minimum Mean-Square Error in Gaussian Channels
- Gradient of mutual information in linear vector Gaussian channels
- The Capacity Region of the Gaussian Multiple-Input Multiple-Output Broadcast Channel
- The heat equation and Stein's identity: connections, applications
- MULTIVARIATE DISPERSION ORDER AND THE NOTION OF COPULA APPLIED TO THE MULTIVARIATE t-DISTRIBUTION
- Some new results on Rényi entropy of residual life and inactivity time
- A simple converse for broadcast channels with additive white Gaussian noise (Corresp.)
- On Stein's identity and its applications
- On the Equivalence Between Stein and De Bruijn Identities
- Hessian and Concavity of Mutual Information, Differential Entropy, and Entropy Power in Linear Vector Gaussian Channels
- An Extremal Inequality Motivated by Multiterminal Information-Theoretic Problems
Cited In (5)
- Entropy flow and de Bruijn's identity for a class of stochastic differential equations driven by fractional Brownian motion
- Title not available
- Extension of de Bruijn's identity to dependent non-Gaussian noise channels
- INEQUALITIES FOR THE DEPENDENT GAUSSIAN NOISE CHANNELS BASED ON FISHER INFORMATION AND COPULAS
- An alternative proof for the minimum Fisher information of Gaussian distribution
This page was built for publication: A de Bruijn's identity for dependent random variables based on copula theory