Norm-Product Belief Propagation: Primal-Dual Message-Passing for Approximate Inference
From MaRDI portal
Publication:5281180
DOI: 10.1109/TIT.2010.2079014
zbMATH Open: 1366.94105
arXiv: 0903.3127
OpenAlex: W2130697955
MaRDI QID: Q5281180
FDO: Q5281180
Authors: Tamir Hazan, Amnon Shashua
Publication date: 27 July 2017
Published in: IEEE Transactions on Information Theory
Abstract: In this paper we treat both forms of probabilistic inference, estimating marginal probabilities of the joint distribution and finding the most probable assignment, through a unified message-passing algorithm architecture. We generalize the Belief Propagation (BP) sum-product and max-product algorithms and the tree-reweighted (TRW) sum-product and max-product algorithms (TRBP), and introduce a new set of convergent algorithms based on "convex-free-energy" and Linear-Programming (LP) relaxation as a zero-temperature limit of a convex-free-energy. The main idea of this work arises from taking a general perspective on the existing BP and TRBP algorithms while observing that they all are reductions from the basic optimization formula $\min f + \sum_i h_i$, where the function $f$ is extended-valued, strictly convex but non-smooth, and the functions $h_i$ are extended-valued functions (not necessarily convex). We use tools from convex duality to present the "primal-dual ascent" algorithm, which is an extension of the Bregman successive projection scheme and is designed to handle optimization of the general type $f + \sum_i h_i$. Mapping the fractional-free-energy variational principle to this framework introduces the "norm-product" message-passing algorithm. Special cases include the sum-product and max-product (BP) algorithms and the TRBP algorithms. When the fractional-free-energy is set to be convex (convex-free-energy), the norm-product is globally convergent for estimating marginal probabilities and for approximating the LP-relaxation. We also introduce another branch of the norm-product, the "convex-max-product". The convex-max-product is convergent (unlike max-product) and aims at solving the LP-relaxation.
Full work available at URL: https://arxiv.org/abs/0903.3127
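The sum-product special case mentioned in the abstract can be illustrated with a minimal sketch: synchronous sum-product message passing on a small chain MRF, checked against brute-force marginalization. The potentials, variable names, and chain structure here are illustrative assumptions, not taken from the paper; on a tree-structured graph, sum-product recovers the exact marginals.

```python
import numpy as np

# Illustrative chain MRF x0 - x1 - x2 with binary variables (not from the paper).
# Unary potentials psi_i(x_i) and one shared symmetric pairwise potential.
unary = [np.array([0.7, 0.3]), np.array([0.5, 0.5]), np.array([0.2, 0.8])]
pair = np.array([[1.2, 0.4], [0.4, 1.2]])  # symmetric, so edge direction is immaterial

edges = [(0, 1), (1, 2)]
msgs = {(i, j): np.ones(2) for a, b in edges for i, j in ((a, b), (b, a))}

def neighbors(i):
    return [j for a, b in edges for i2, j in ((a, b), (b, a)) if i2 == i]

# Synchronous sum-product sweeps; on a tree this converges after a few passes.
for _ in range(10):
    new = {}
    for (i, j) in msgs:
        # Product of the unary potential and all incoming messages except from j.
        incoming = unary[i].copy()
        for k in neighbors(i):
            if k != j:
                incoming = incoming * msgs[(k, i)]
        # Sum over x_i against the pairwise potential, then normalize for stability.
        m = pair @ incoming
        new[(i, j)] = m / m.sum()
    msgs = new

def marginal(i):
    b = unary[i].copy()
    for k in neighbors(i):
        b = b * msgs[(k, i)]
    return b / b.sum()

# Brute-force check: enumerate the joint distribution over all assignments.
joint = np.zeros((2, 2, 2))
for x0 in range(2):
    for x1 in range(2):
        for x2 in range(2):
            joint[x0, x1, x2] = (unary[0][x0] * unary[1][x1] * unary[2][x2]
                                 * pair[x0, x1] * pair[x1, x2])
joint /= joint.sum()

print(np.allclose(marginal(0), joint.sum(axis=(1, 2))))  # exact on trees
```

The norm-product algorithm of the paper generalizes this scheme (and its max-product and TRBP variants) via the convex-free-energy framework; the sketch above only shows the classical sum-product special case.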
Convex programming (90C25) Signal theory (characterization, reconstruction, filtering, etc.) (94A12) Graphical methods in statistics (62A09)
Cited In (7)
- Blending learning and inference in conditional random fields
- Energy distribution view for monotonic dual decomposition
- Improving probabilistic inference in graphical models with determinism and cycles
- Image labeling based on graphical models using Wasserstein messages and geometric assignment
- Title not available
- Margin losses for training conditional random fields
- Efficient semidefinite branch-and-cut for MAP-MRF inference