High-dimensional causal discovery under non-Gaussianity
Publication:5222218
Abstract: We consider graphical models based on a recursive system of linear structural equations. This implies that there is an ordering, σ, of the variables such that each observed variable Y_v is a linear function of a variable-specific error term and the other observed variables Y_u with σ(u) < σ(v). The causal relationships, i.e., which other variables the linear functions depend on, can be described using a directed graph. It has been previously shown that when the variable-specific error terms are non-Gaussian, the exact causal graph, as opposed to a Markov equivalence class, can be consistently estimated from observational data. We propose an algorithm that yields consistent estimates of the graph also in high-dimensional settings in which the number of variables may grow at a faster rate than the number of observations, but in which the underlying causal structure features suitable sparsity; specifically, the maximum in-degree of the graph is controlled. Our theoretical analysis is couched in the setting of log-concave error distributions.
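The model class in the abstract can be illustrated with a small simulation: variables are generated along a causal ordering, each as a linear function of at most a fixed number of earlier variables plus a non-Gaussian (here Laplace, hence log-concave) error term. This is a minimal sketch of the data-generating model only, not the estimation algorithm proposed in the paper; the function name and all parameter choices are illustrative.

```python
import numpy as np

def simulate_recursive_sem(p=10, max_in_degree=2, n=500, rng=None):
    """Simulate a recursive linear SEM with non-Gaussian errors.

    Variables are generated in the causal ordering 0..p-1; variable v is a
    linear function of at most `max_in_degree` earlier variables plus its
    own Laplace error term, matching the bounded in-degree sparsity
    condition in the abstract. (Illustrative sketch only.)
    """
    rng = np.random.default_rng(rng)
    B = np.zeros((p, p))  # B[v, u] = coefficient of parent u in equation for v
    for v in range(1, p):
        k = min(v, max_in_degree)
        parents = rng.choice(v, size=rng.integers(0, k + 1), replace=False)
        B[v, parents] = rng.uniform(0.5, 1.5, size=len(parents))
    e = rng.laplace(size=(n, p))               # non-Gaussian error terms
    # Solve (I - B) X^T = e^T, i.e., X = (I - B)^{-1} e row-wise
    X = np.linalg.solve(np.eye(p) - B, e.T).T
    return X, B

X, B = simulate_recursive_sem(p=8, n=200, rng=0)
```

Because the coefficient matrix B is strictly lower triangular in the causal ordering, (I - B) is invertible, and the nonzero pattern of B is exactly the edge set of the directed acyclic causal graph.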
Recommendations
- Learning high-dimensional Gaussian linear structural equation models with heterogeneous error variances
- DirectLiNGAM: a direct method for learning a linear non-Gaussian structural equation model
- CAM: causal additive models, high-dimensional order search and penalized regression
- On causal discovery with an equal-variance assumption
- High-dimensional learning of linear causal networks via inverse covariance estimation
Cited in (19)
- Causal discovery in heavy-tailed models
- Learning high-dimensional Gaussian linear structural equation models with heterogeneous error variances
- Locally robust inference for non-Gaussian SVAR models
- Higher-Order Least Squares: Assessing Partial Goodness of Fit of Linear Causal Models
- The costs and benefits of uniformly valid causal inference with high-dimensional nuisance parameters
- Estimating bounds on causal effects in high-dimensional and possibly confounded systems
- High-dimensional learning of linear causal networks via inverse covariance estimation
- Identifiability of additive noise models using conditional variances
- Causal Discovery via Reproducing Kernel Hilbert Space Embeddings
- Multi-Trek Separation in Linear Structural Equation Models
- DirectLiNGAM: a direct method for learning a linear non-Gaussian structural equation model
- Estimating exogenous variables in data with more variables than observations
- Score-based causal learning in additive noise models
- Causal structure learning: a combinatorial perspective
- A causal discovery algorithm based on the prior selection of leaf nodes
- Efficient Learning of Quadratic Variance Function Directed Acyclic Graphs via Topological Layers
- Efficient learning of nonparametric directed acyclic graph with statistical guarantee
- Order-independent constraint-based causal structure learning
- On causal discovery with an equal-variance assumption