Inference and learning in probabilistic logic programs using weighted Boolean formulas

From MaRDI portal
Publication: 4592983

DOI: 10.1017/S1471068414000076
zbMath: 1379.68062
arXiv: 1304.6810
MaRDI QID: Q4592983

Dimitar Shterionov, Joris Renkens, Ingo Thon, Bernd Gutmann, Gerda Janssens, Daan Fierens, Guy Van den Broeck, Luc De Raedt

Publication date: 9 November 2017

Published in: Theory and Practice of Logic Programming

Full work available at URL: https://arxiv.org/abs/1304.6810




Related Items (51)

Efficient Knowledge Compilation Beyond Weighted Model Counting
Symbolic DNN-tuner
Model checking finite-horizon Markov chains with probabilistic inference
Compacting Boolean Formulae for Inference in Probabilistic Logic Programming
A survey of lifted inference approaches for probabilistic logic programming under the distribution semantics
Exploiting local and repeated structure in dynamic Bayesian networks
Diffusion centrality: a paradigm to maximize spread in social networks
\(T_{\mathcal{P}}\)-compilation for inference in probabilistic logic programs
Probabilistic abductive logic programming using Dirichlet priors
A functional account of probabilistic programming with possible worlds. Declarative pearl
Explanations as programs in probabilistic logic programming
Handling epistemic and aleatory uncertainties in probabilistic circuits
Exact stochastic constraint optimisation with applications in network analysis
Connecting Width and Structure in Knowledge Compilation (Extended Version)
Computing LPMLN using ASP and MLN solvers
Knowledge compilation of logic programs using approximation fixpoint theory
Generating random instances of weighted model counting. An empirical analysis with varying primal treewidth
Probabilistic (logic) programming concepts
Bandit-based Monte-Carlo structure learning of probabilistic logic programs
Lifted inference with tree axioms
Disjunctive delimited control
Statistical relational extension of answer set programming
Unified decomposition-aggregation (UDA) rules: dynamic, schematic, novel axioms
Rule Induction and Reasoning over Knowledge Graphs
Online event recognition over noisy data streams
IASCAR: incremental answer set counting by anytime refinement
The joy of probabilistic answer set programming: semantics, complexity, expressivity, inference
Thirty years of credal networks: specification, algorithms and complexity
The complexity of Bayesian networks specified by propositional and relational languages
Lifted Bayesian Filtering in Multiset Rewriting Systems
MAP Inference for Probabilistic Logic Programming
Some thoughts on knowledge-enhanced machine learning
Neural probabilistic logic programming in DeepProbLog
On the complexity of propositional and relational credal networks
Speeding up parameter and rule learning for acyclic probabilistic logic programs
Using SWISH to Realize Interactive Web-based Tutorials for Logic-based Languages
The finite model theory of Bayesian network specifications: descriptive complexity and zero/one laws
Probabilistic sentence satisfiability: an approach to PSAT
Probabilistic abstract argumentation frameworks, a possible world view
Complexity results for probabilistic answer set programming
Predictive spreadsheet autocompletion with constraints
Confidences for commonsense reasoning
P-log: refinement and a new coherency condition
Learning hierarchical probabilistic logic programs
Advanced SMT techniques for weighted model integration
Connecting knowledge compilation classes and width parameters
Approximate weighted model integration on DNF structures
A taxonomy of weight learning methods for statistical relational learning
Optimizing Probabilities in Probabilistic Logic Programs
Utilizing Treewidth for Quantitative Reasoning on Epistemic Logic Programs
Weighted model counting without parameter variables


Uses Software


Cites Work

