Learning factor graphs in polynomial time and sample complexity
Publication: 3174020
zbMATH Open: 1222.68128 · arXiv: 1207.1366 · MaRDI QID: Q3174020 · FDO: Q3174020
Andrew Y. Ng, Daphne Koller, Pieter Abbeel
Publication date: 12 October 2011
Full work available at URL: https://arxiv.org/abs/1207.1366
Recommendations
- Large-sample learning of Bayesian networks is NP-hard
- The sample complexity of learning fixed-structure Bayesian networks
- Efficiently learning Ising models on arbitrary graphs (extended abstract)
- Parameterized complexity results for exact Bayesian network structure learning
- Learning loopy graphical models with latent variables: efficient methods and guarantees
Keywords: Bayesian networks; probabilistic graphical models; Markov networks; factor graphs; parameter and structure learning
Mathematics Subject Classification: Learning and adaptive systems in artificial intelligence (68T05); Graphical methods in statistics (62A09)
Cited In (12)
- A review of message passing algorithms in estimation of distribution algorithms
- Piecewise training for structured prediction
- Convergence Theorems of Estimation of Distribution Algorithms
- Computational implications of reducing data to sufficient statistics
- Probabilistic Graphical Models and Markov Networks
- A factor graph based genetic algorithm
- Sparse model selection in the highly under-sampled regime
- Near-Optimal Learning of Tree-Structured Distributions by Chow and Liu
- Neural assemblies revealed by inferred connectivity-based models of prefrontal cortex recordings
- Learning a tree-structured Ising model in order to make predictions
- High-dimensional structure estimation in Ising models: local separation criterion
- High-dimensional Ising model selection using \(\ell _{1}\)-regularized logistic regression