Information Theoretical Analysis of Multivariate Correlation
DOI: 10.1147/RD.41.0066
zbMATH Open: 0097.35003
DBLP: journals/ibmrd/Watanabe60
OpenAlex: W2095439994
Wikidata: Q56454253
Scholia: Q56454253
MaRDI QID: Q3276988
FDO: Q3276988
Authors: Satosi Watanabe
Publication date: 1960
Published in: IBM Journal of Research and Development
Full work available at URL: https://doi.org/10.1147/rd.41.0066
Cited In (59)
- Dependency reduction with divisive normalization: justification and effectiveness
- TCMI: a non-parametric mutual-dependence estimator for multivariate continuous distributions
- Information-gain computation in the Fifth system
- The effects of system–environment correlations on heat transport and quantum entanglement via collision models
- Partially ordered permutation complexity of coupled time series
- A multivariate approach to the symmetrical uncertainty measure: application to feature selection problem
- Measure concentration and the weak Pinsker property
- Title not available
- RECONSTRUCTABILITY ANALYSIS: Overview and Bibliography
- The stochastic thermodynamics of computation
- Effective feature construction by maximum common subgraph sampling
- Factorized mutual information maximization
- Information integration from distributed threshold-based interactions
- Multi-variate correlation and mixtures of product measures
- Large-sample asymptotic approximations for the sampling and posterior distributions of differential entropy for multivariate normal distributions
- Continuity and additivity properties of information decompositions
- Title not available
- Anatomy of a bit: Information in a time series observation
- A multivariate extension of mutual information for growing neural networks
- The Evolution of Representation in Simple Cognitive Networks
- The correlational entropy production during the local relaxation in a many body system with Ising interactions
- Title not available
- Entropic Trust Region for Densest Crystallographic Symmetry Group Packings
- Information of interactions in complex systems
- Generalized Linear Mixed Models Based on Latent Markov Heterogeneity Structures
- Evolution of holographic entropy quantities for composite quantum systems
- Decomposition criteria for the design of complex systems
- Exponential decay of pairwise correlation in Gaussian graphical models with an equicorrelational one-dimensional connection pattern
- Close-to-optimal continuity bound for the von Neumann entropy and other quasi-classical applications of the Alicki-Fannes-Winter technique
- Spatio-chromatic information available from different neural layers via gaussianization
- DEFT: distilling entangled factors by preventing information diffusion
- Response improvement in complex experiments by co-information composite likelihood optimization
- Cumulants of multiinformation density in the case of a multivariate normal distribution
- Reliable clustering of Bernoulli mixture models
- A Causal Perspective on the Analysis of Signal and Noise Correlations and Their Role in Population Coding
- Ergotropy from quantum and classical correlations
- Higher-Order Description of Brain Function
- Some comments in connection with Rozeboom's linear correlation theory
- Informational coverage and correlational analysis
- The theory of abstract partials: An introduction
- Spurious correlation as an approximation of the mutual information between redundant outputs and an unknown input
- Causal inference for multivariate stochastic process prediction
- Squashed entanglement and approximate private states
- Synergy, redundancy, and multivariate information measures: an experimentalist's perspective
- Mutual information analysis: a comprehensive study
- Generating Spike Trains with Specified Correlation Coefficients
- A measure of statistical complexity based on predictive information with application to finite spin systems
- Linear correlations between sets of variables
- MEASURES OF UNCERTAINTY AND INFORMATION BASED ON POSSIBILITY DISTRIBUTIONS
- Universal Features for High-Dimensional Learning and Inference
- Information thermodynamics of encoding and encoders
- Minimum information dependence modeling
- Evolving higher-order synergies reveals a trade-off between stability and information-integration capacity in complex systems
- Smooth min-entropy lower bounds for approximation chains
- The Berkelmans–Pries dependency function: A generic measure of dependence between random variables
- fastMI: a fast and consistent copula-based nonparametric estimator of mutual information
- Exploring nonlinear dynamics in brain functionality through phase portraits and fuzzy recurrence plots
- Family of quantum mutual information in multiparty quantum systems
- Probing multipartite entanglement through persistent homology