Sparse model selection in the highly under-sampled regime
From MaRDI portal
Publication:3302832
DOI: 10.1088/1742-5468/2016/09/093404
zbMATH Open: 1456.91129
arXiv: 1603.00952
OpenAlex: W2290456206
MaRDI QID: Q3302832
Authors: Nicola Bulso, Yasser Roudi, Matteo Marsili
Publication date: 11 August 2020
Published in: Journal of Statistical Mechanics: Theory and Experiment
Abstract: We propose a method for recovering the structure of a sparse undirected graphical model when very few samples are available. The method decides on the presence or absence of bonds between pairs of variables by considering one pair at a time and using a closed-form formula, derived analytically by computing the posterior probability of every possible model for a two-body system under Jeffreys' prior. The approach does not rely on the optimisation of any cost function and is consequently much faster than existing algorithms. Despite this computational advantage, numerical results show that for several sparse topologies the algorithm is comparable to the best existing algorithms, and is more accurate in the presence of hidden variables. We apply this approach to the analysis of US stock-market data and to neural data, in order to show its efficiency in recovering robust statistical dependencies in real data with correlations that are non-stationary in time and space.
Full work available at URL: https://arxiv.org/abs/1603.00952
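The pair-by-pair test described in the abstract can be illustrated with a generic Bayesian sketch: for each pair of binary variables, compare the closed-form evidence of a coupled (full 2x2 multinomial) model against an independent (product of marginals) model, here under a symmetric Dirichlet(1/2) prior, which is Jeffreys' prior for the multinomial. This is a simplified stand-in for the paper's actual formula, and all function names are illustrative, not taken from the paper.

```python
# Hedged sketch: closed-form Bayesian "bond vs. no bond" decision for one
# pair of binary variables, using a Dirichlet(1/2) (Jeffreys-type) prior.
# This illustrates the general idea only; it is not the paper's exact formula.
from math import lgamma

def log_evidence_dirichlet(counts, alpha=0.5):
    """Closed-form log marginal likelihood of multinomial counts
    under a symmetric Dirichlet(alpha) prior."""
    n = sum(counts)
    k = len(counts)
    return (lgamma(k * alpha) - lgamma(n + k * alpha)
            + sum(lgamma(c + alpha) - lgamma(alpha) for c in counts))

def log_bayes_factor(n00, n01, n10, n11):
    """log P(data | coupled) - log P(data | independent), computed from
    the 2x2 table of joint counts for a pair of binary variables."""
    # Coupled model: a single 4-state multinomial over the joint outcomes.
    log_ev_dep = log_evidence_dirichlet([n00, n01, n10, n11])
    # Independent model: product of the two marginal Bernoulli evidences.
    log_ev_ind = (log_evidence_dirichlet([n00 + n01, n10 + n11])
                  + log_evidence_dirichlet([n00 + n10, n01 + n11]))
    return log_ev_dep - log_ev_ind

# Strongly correlated counts: evidence favours adding the bond ...
print(log_bayes_factor(40, 5, 5, 40) > 0)
# ... while uniform counts favour the simpler independent model.
print(log_bayes_factor(25, 25, 25, 25) < 0)
```

Because both evidences are closed-form, the decision requires no cost-function optimisation, which is the source of the speed advantage the abstract claims.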
Recommendations
- Model selection through sparse maximum likelihood estimation for multivariate Gaussian or binary data
- Sparse inverse covariance estimation with the graphical lasso
- Bayesian structure learning in sparse Gaussian graphical models
- High-dimensional graphs and variable selection with the Lasso
- Bayesian model selection with graph structured sparsity
Cites Work
- Estimating the dimension of a model
- Title not available
- A new look at the statistical model identification
- Estimation of sparse binary pairwise Markov networks using pseudo-likelihoods
- Approximating discrete probability distributions with dependence trees
- High-dimensional Ising model selection using \(\ell _{1}\)-regularized logistic regression
- High-dimensional structure estimation in Ising models: local separation criterion
- High-dimensional Ising model selection with Bayesian information criteria
- An invariant form for the prior probability in estimation problems
- Compressed sensing
- Universal coding, information, prediction, and estimation
- Fisher information and stochastic complexity
- Counting probability distributions: Differential geometry and model selection
- Information, Physics, and Computation
- Title not available
- Theory of Financial Risk and Derivative Pricing
- Expectation consistent approximate inference
- Dynamic instability in generic model of multi-assets markets
- Learning factor graphs in polynomial time and sample complexity
- Improving Markov network structure learning using decision trees
- Efficient Markov Network Structure Discovery Using Independence Tests
- Reconstruction of Markov random fields from samples: some observations and algorithms
- Belief propagation and replicas for inference and learning in a kinetic Ising model with hidden spins
Cited In (5)
This page was built for publication: Sparse model selection in the highly under-sampled regime