Efficient Relaxations for Dense CRFs with Sparse Higher-Order Potentials
Publication: 5236643
Abstract: Dense conditional random fields (CRFs) have become a popular framework for modelling several problems in computer vision, such as stereo correspondence and multi-class semantic segmentation. By modelling long-range interactions, dense CRFs provide a labelling that captures finer detail than their sparse counterparts. Currently, the state-of-the-art algorithm performs mean-field inference using a filter-based method but fails to provide a strong theoretical guarantee on the quality of the solution. A question naturally arises as to whether it is possible to obtain a maximum a posteriori (MAP) estimate of a dense CRF using a principled method. In this paper, we show that this is indeed possible: by using a filter-based method, continuous relaxations of the MAP problem can be optimised efficiently with state-of-the-art algorithms. Specifically, we solve a quadratic programming (QP) relaxation using the Frank-Wolfe algorithm and a linear programming (LP) relaxation by developing a proximal minimisation framework. By exploiting labelling consistency in the higher-order potentials and utilising the filter-based method, we formulate these algorithms such that each iteration has a complexity linear in the number of classes and random variables. The presented algorithms can be applied to any labelling problem using a dense CRF with sparse higher-order potentials. We use semantic segmentation as an example application, as it demonstrates the ability of the algorithms to scale to dense CRFs of large dimensions. Experiments on the Pascal dataset show that the presented algorithms attain lower energies than the mean-field inference method.
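To make the QP route concrete: the relaxation optimises the CRF energy over a product of simplices (one probability vector per random variable), and each Frank-Wolfe iteration needs only the energy gradient plus a per-variable argmin, since the linear subproblem over a simplex is always solved at a vertex. The sketch below is illustrative rather than the authors' implementation: the `pairwise_grad` callable stands in for the filter-based evaluation of the pairwise (and higher-order) gradient described in the abstract, and the function name and the fixed 2/(t+2) step size are assumptions.

```python
import numpy as np

def frank_wolfe_qp(unaries, pairwise_grad, n_iter=100):
    """Minimal Frank-Wolfe loop for a QP relaxation of dense-CRF MAP.

    unaries:       (N, L) array of unary potentials.
    pairwise_grad: callable y -> (N, L) gradient of the pairwise term;
                   in the paper this is computed with a filter-based
                   method, treated here as a black box.
    """
    N, L = unaries.shape
    y = np.full((N, L), 1.0 / L)               # start at the simplex barycentre
    for t in range(n_iter):
        grad = unaries + pairwise_grad(y)      # gradient of the relaxed energy
        # Linear minimisation oracle: over a product of simplices, the
        # minimiser puts all mass on the lowest-gradient label per variable.
        s = np.zeros_like(y)
        s[np.arange(N), grad.argmin(axis=1)] = 1.0
        gamma = 2.0 / (t + 2.0)                # standard diminishing step size
        y += gamma * (s - y)                   # convex update stays feasible
    return y.argmax(axis=1)                    # round to a discrete labelling

# Toy usage with a hypothetical symmetric pairwise kernel K of shape (N, N):
# labels = frank_wolfe_qp(phi, lambda y: 2.0 * K @ y)
```

Each iteration costs one gradient evaluation plus one argmin per variable, which is how the complexity linear in the number of classes and random variables claimed in the abstract arises once the gradient itself is evaluated with the filter-based method.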
Recommendations
- Efficient robust conditional random fields
- scientific article; zbMATH DE number 6378100
- Discriminative Training of Deep Fully Connected Continuous CRFs With Task-Specific Loss
- Exploiting structural consistencies with stacked conditional random fields
- Exponentiated gradient algorithms for conditional random fields and max-margin Markov networks
- Large margin cost-sensitive learning of conditional random fields
Cites work
- scientific article; zbMATH DE number 3479763
- An analysis of convex relaxations for MAP estimation of discrete MRFs
- Approximation algorithms for classification problems with pairwise relationships, metric labeling and Markov random fields
- Approximation algorithms for the metric labeling problem via a new linear programming formulation
- Duality between subgradient and conditional gradient methods
- Efficient Relaxations for Dense CRFs with Sparse Higher-Order Potentials
- Filter-based mean-field inference for random fields with higher-order terms and product label-spaces
- Inference methods for CRFs with co-occurrence statistics
- Nonparametric guidance of autoencoder representations using label information
- Probabilistic graphical models
Cited in (8)
- On learning conditional random fields for stereo
- Revisiting Deep Structured Models for Pixel-Level Labeling with Gradient-Based Inference
- Tighter continuous relaxations for MAP inference in discrete MRFs: a survey
- Harmony potentials fusing global and local scale for semantic image segmentation
- A doubly graduated method for inference in Markov random field
- Discriminative training of conditional random fields with probably submodular constraints
- Efficient Relaxations for Dense CRFs with Sparse Higher-Order Potentials
- Discriminative Training of Deep Fully Connected Continuous CRFs With Task-Specific Loss