Joint dependence distribution of data set using optimizing Tsallis copula entropy
From MaRDI portal
Publication:2163100
DOI: 10.1016/J.PHYSA.2019.121897
OpenAlex: W2953826057
Wikidata: Q127612774
Scholia: Q127612774
MaRDI QID: Q2163100
Gholam Reza Mohtashami Borzadaran, Seyedeh Azadeh Fallah Mortezanejad, Bahram Sadeghpour-Gildeh
Publication date: 10 August 2022
Published in: Physica A
Full work available at URL: https://doi.org/10.1016/j.physa.2019.121897
Cites Work
- A Mathematical Theory of Communication
- Mutual information is copula entropy
- Copulas with maximum entropy
- A copula entropy approach to correlation measurement at the country level
- On nonparametric measures of dependence for random variables
- Possible generalization of Boltzmann-Gibbs statistics.
- Information Theory and Statistical Mechanics
- Ordinal Measures of Association
- Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy
- Extension of Darmois-Skitovich theorem to functions of random variables satisfying an addition theorem
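For context on the cited Tsallis paper ("Possible generalization of Boltzmann-Gibbs statistics"), which introduces the entropy family this publication optimizes over copulas: below is a minimal, illustrative sketch of the discrete Tsallis entropy. It is not code from the publication itself, and the function name is my own; it only shows the standard definition S_q(p) = (1 - Σ p_i^q)/(q - 1), which reduces to the Shannon entropy as q → 1.

```python
import math

def tsallis_entropy(p, q):
    """Discrete Tsallis entropy S_q(p) = (1 - sum_i p_i**q) / (q - 1).

    For q -> 1 this converges to the Shannon entropy -sum_i p_i * log(p_i),
    so the q = 1 case is handled separately.
    """
    if abs(q - 1.0) < 1e-12:
        return -sum(pi * math.log(pi) for pi in p if pi > 0.0)
    return (1.0 - sum(pi ** q for pi in p if pi > 0.0)) / (q - 1.0)

# Uniform distribution on two outcomes with q = 2: (1 - (0.25 + 0.25)) / 1 = 0.5
print(tsallis_entropy([0.5, 0.5], 2.0))
```

For q = 1 the same call returns log 2 ≈ 0.6931, matching the Shannon entropy of a fair coin.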