Learning factor graphs in polynomial time and sample complexity
From MaRDI portal
Publication: 3174020
Recommendations
- Large-sample learning of Bayesian networks is NP-hard
- The sample complexity of learning fixed-structure Bayesian networks
- Efficiently learning Ising models on arbitrary graphs (extended abstract)
- Parameterized complexity results for exact Bayesian network structure learning
- Learning loopy graphical models with latent variables: efficient methods and guarantees
Cited in (15)
- Sparse model selection in the highly under-sampled regime
- Probabilistic graphical models and Markov networks
- Computational implications of reducing data to sufficient statistics
- Convergence theorems of estimation of distribution algorithms
- Learning a tree-structured Ising model in order to make predictions
- The minimax learning rates of normal and Ising undirected graphical models
- A review of message passing algorithms in estimation of distribution algorithms
- Learning loosely connected Markov random fields
- Efficient learning of discrete graphical models*
- Near-Optimal Learning of Tree-Structured Distributions by Chow and Liu
- High-dimensional structure estimation in Ising models: local separation criterion
- High-dimensional Ising model selection using \(\ell _{1}\)-regularized logistic regression
- A factor graph based genetic algorithm
- Neural assemblies revealed by inferred connectivity-based models of prefrontal cortex recordings
- Piecewise training for structured prediction