Dependence, correlation and Gaussianity in independent component analysis
DOI: 10.1162/jmlr.2003.4.7-8.1177
zbMATH Open: 1061.62096
OpenAlex: W2136305079
MaRDI QID: Q4669447
FDO: Q4669447
Authors: Jean-François Cardoso
Publication date: 15 April 2005
Published in: Journal of Machine Learning Research (the DOI record is listed under the CrossRef Listing of Deleted DOIs)
Full work available at URL: https://doi.org/10.1162/jmlr.2003.4.7-8.1177
Recommendations
- Independent component analysis in the presence of Gaussian noise by maximizing joint likelihood
- scientific article; zbMATH DE number 1233949
- scientific article; zbMATH DE number 2018607
- Independent component analysis: principles and practice
- Independent component analysis via nonparametric maximum likelihood estimation
- Quantifying identifiability in independent component analysis
- Independent Component Analysis and Blind Signal Separation
- Mean-Field Approaches to Independent Component Analysis
- Independent component analysis: recent advances
Keywords: non-Gaussianity; mutual information; minimum entropy; information geometry; source separation; cumulant expansions
MSC classification: Statistical aspects of information-theoretic topics (62B10); Multivariate analysis (62H99); Artificial intelligence (68T99)
Cited In (22)
- Recognizing and visualizing departures from independence in bivariate data using local Gaussian correlation
- Independent component analysis in the light of information geometry
- Title not available
- Quantifying identifiability in independent component analysis
- Nonlinear Extraction of Independent Components of Natural Images Using Radial Gaussianization
- Investigation on the skewness for independent component analysis
- Information geometry on hierarchy of probability distributions
- An assessment of Hermite function based approximations of mutual information applied to independent component analysis
- Gene feature interference deconvolution
- PMOG: The projected mixture of Gaussians model with application to blind source separation
- Title not available
- Entropy embedding and fluctuation analysis in genomic manifolds
- Simultaneous estimation of nongaussian components and their correlation structure
- Cross-cumulants measure for independence
- Information Theory, Relative Entropy and Statistics
- Minimax Mutual Information Approach for Independent Component Analysis
- Spatio-chromatic information available from different neural layers via gaussianization
- Modelling sequences using pairwise relational features
- Independent component analysis: recent advances
- Exploring nonlinear dynamics in brain functionality through phase portraits and fuzzy recurrence plots
- Spectral independent component analysis
- Gaussian lower bound for the information bottleneck limit