A Proximal Point Dual Newton Algorithm for Solving Group Graphical Lasso Problems
Publication: Q5116554
DOI: 10.1137/19M1267830
zbMATH Open: 1448.90096
arXiv: 1906.04647
OpenAlex: W3048513222
MaRDI QID: Q5116554
FDO: Q5116554
Kim-Chuan Toh, Yangjing Zhang, Defeng Sun, Ning Zhang
Publication date: 18 August 2020
Published in: SIAM Journal on Optimization
Abstract: Undirected graphical models have been especially popular for learning the conditional independence structure among a large number of variables when the observations are drawn independently and identically from the same distribution. However, many modern statistical problems involve categorical or time-varying data, which may follow different but related underlying distributions. In order to learn a collection of related graphical models simultaneously, various joint graphical models inducing sparsity in graphs and similarity across graphs have been proposed. In this paper, we propose an implementable proximal point dual Newton algorithm (PPDNA) for solving the group graphical Lasso model, which encourages a shared pattern of sparsity across graphs. Although the group graphical Lasso regularizer is non-polyhedral, the asymptotic superlinear convergence of the proposed PPDNA can be established by leveraging the local Lipschitz continuity of the Karush-Kuhn-Tucker solution mapping associated with the group graphical Lasso model. A variety of numerical experiments on real data sets illustrate that PPDNA for solving the group graphical Lasso model is highly efficient and robust.
Full work available at URL: https://arxiv.org/abs/1906.04647
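For context, a minimal sketch of the group graphical Lasso model referred to in the abstract, in the form popularized by the joint graphical Lasso literature; the symbols $S^{(k)}$, $\lambda_1$, $\lambda_2$, and $K$ are notational assumptions and are not taken from this page:

\[
\min_{\Theta^{(1)},\dots,\Theta^{(K)} \succ 0} \;\; \sum_{k=1}^{K} \Big( \langle S^{(k)}, \Theta^{(k)} \rangle - \log\det \Theta^{(k)} \Big)
\; + \; \lambda_1 \sum_{k=1}^{K} \sum_{i \neq j} \big|\Theta^{(k)}_{ij}\big|
\; + \; \lambda_2 \sum_{i \neq j} \Big( \sum_{k=1}^{K} \big(\Theta^{(k)}_{ij}\big)^2 \Big)^{1/2},
\]

where $S^{(k)}$ is the sample covariance matrix of the $k$-th data group and the $\ell_2$-norm coupling of the $(i,j)$ entries across the $K$ precision matrices encourages a shared sparsity pattern. A proximal point scheme for such a problem would, roughly, generate outer iterates of the form

\[
X^{t+1} \approx \arg\min_{X} \Big\{ f(X) + \tfrac{1}{2\sigma_t}\,\|X - X^{t}\|^2 \Big\},
\]

with each subproblem solved through its dual by a semismooth Newton method. This is only an illustrative outline of the algorithmic template, not the authors' exact formulation.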
Mathematics Subject Classification:
- Analysis of variance and covariance (ANOVA) (62J10)
- Programming involving graphs or networks (90C35)
Cites Work
- Fast Algorithms for Large-Scale Generalized Distance Weighted Discrimination
- Variational Analysis
- Model Selection and Estimation in Regression with Grouped Variables
- Sparse inverse covariance estimation with the graphical lasso
- Sparse permutation invariant covariance estimation
- Gradient directed regularization for sparse Gaussian concentration graphs, with applications to inference of genetic networks
- A Newton-CG Augmented Lagrangian Method for Semidefinite Programming
- An augmented Lagrangian approach for sparse principal component analysis
- The Joint Graphical Lasso for Inverse Covariance Estimation Across Multiple Classes
- Solving Log-Determinant Optimization Problems by a Newton-CG Primal Proximal Point Algorithm
- Sparse Reconstruction by Separable Approximation
- Some continuity properties of polyhedral multifunctions
- Monotone Operators and the Proximal Point Algorithm
- Augmented Lagrangians and Applications of the Proximal Point Algorithm in Convex Programming
- Proximité et dualité dans un espace hilbertien
- Functional Analysis
- A proximal point algorithm for log-determinant optimization with group Lasso regularization
- Asymptotic Convergence Analysis of the Proximal Point Algorithm
- Fused Multiple Graphical Lasso
- The Strong Second-Order Sufficient Condition and Constraint Nondegeneracy in Nonlinear Semidefinite Programming and Their Implications
- Solving variational inequality problems via smoothing-nonsmooth reformulations
- On the R-superlinear convergence of the KKT residuals generated by the augmented Lagrangian method for convex composite conic programming
- A family of inexact SQA methods for non-smooth convex minimization with provable convergence guarantees based on the Luo-Tseng error bound property
- Large-scale Sparse Inverse Covariance Matrix Estimation
- On Efficiently Solving the Subproblems of a Level-Set Method for Fused Lasso Problems
- An efficient Hessian based algorithm for solving large-scale sparse group Lasso problems
- Efficient Sparse Semismooth Newton Methods for the Clustered Lasso Problem
Cited In (6)
- An efficient algorithm for Fantope-constrained sparse principal subspace estimation problem
- An Efficient Linearly Convergent Regularized Proximal Point Algorithm for Fused Multiple Graphical Lasso Problems
- Sparse precision matrix estimation with missing observations
- Efficient Sparse Hessian-Based Semismooth Newton Algorithms for Dantzig Selector
- A dual semismooth Newton based augmented Lagrangian method for large-scale linearly constrained sparse group square-root Lasso problems
- Variational low-light image enhancement based on fractional-order differential