A proximal point dual Newton algorithm for solving group graphical Lasso problems
Publication: 5116554
Abstract: Undirected graphical models are widely used to learn the conditional independence structure among a large number of variables when the observations are drawn independently and identically from a common distribution. However, many modern statistical problems involve categorical or time-varying data that may follow different but related underlying distributions. To learn a collection of related graphical models simultaneously, various joint graphical models have been proposed that induce sparsity within each graph and similarity across graphs. In this paper, we propose an implementable proximal point dual Newton algorithm (PPDNA) for solving the group graphical Lasso model, which encourages a shared sparsity pattern across graphs. Although the group graphical Lasso regularizer is non-polyhedral, the asymptotic superlinear convergence of PPDNA can be established by leveraging the local Lipschitz continuity of the Karush-Kuhn-Tucker solution mapping associated with the group graphical Lasso model. Numerical experiments on real data sets illustrate that PPDNA applied to the group graphical Lasso model is highly efficient and robust.
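For context, here is a minimal sketch of the group graphical Lasso model the abstract refers to, following the joint graphical Lasso formulation of Danaher, Wang and Witten (cited below). The notation Θ^(k), S^(k), λ1, λ2 and the omission of per-class sample weights are assumptions not spelled out on this page, not the authors' exact statement.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Group graphical Lasso over K related precision matrices \Theta^{(k)} \succ 0,
% with sample covariance matrices S^{(k)} and parameters \lambda_1, \lambda_2 > 0.
\[
\min_{\Theta^{(1)},\dots,\Theta^{(K)} \succ 0}\;
  \sum_{k=1}^{K} \Bigl( \operatorname{tr}\bigl(S^{(k)}\Theta^{(k)}\bigr)
  - \log\det \Theta^{(k)} \Bigr)
  + \lambda_1 \sum_{k=1}^{K} \sum_{i \neq j} \bigl|\Theta^{(k)}_{ij}\bigr|
  + \lambda_2 \sum_{i \neq j} \sqrt{\sum_{k=1}^{K} \bigl(\Theta^{(k)}_{ij}\bigr)^{2}}
\]
% The grouped \ell_1/\ell_2 term couples the K graphs and encourages a shared
% sparsity pattern across them. It is non-polyhedral, which is why the abstract
% appeals to local Lipschitz continuity of the KKT solution mapping, rather than
% polyhedrality, to obtain the superlinear convergence of PPDNA.
\end{document}
```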
Recommendations
- An efficient linearly convergent regularized proximal point algorithm for fused multiple graphical Lasso problems
- A proximal point algorithm for log-determinant optimization with group Lasso regularization
- Sparse inverse covariance estimation with the graphical lasso
- An efficient Hessian based algorithm for solving large-scale sparse group Lasso problems
- The graphical lasso: new insights and alternatives
Cites work
- scientific article; zbMATH DE number 46303 (no title available)
- scientific article; zbMATH DE number 1113627 (no title available)
- scientific article; zbMATH DE number 1502618 (no title available)
- A Newton-CG augmented Lagrangian method for semidefinite programming
- A family of inexact SQA methods for non-smooth convex minimization with provable convergence guarantees based on the Luo-Tseng error bound property
- A proximal point algorithm for log-determinant optimization with group Lasso regularization
- An augmented Lagrangian approach for sparse principal component analysis
- An efficient Hessian based algorithm for solving large-scale sparse group Lasso problems
- Asymptotic Convergence Analysis of the Proximal Point Algorithm
- Augmented Lagrangians and Applications of the Proximal Point Algorithm in Convex Programming
- Efficient sparse semismooth Newton methods for the clustered Lasso problem
- Fast Algorithms for Large-Scale Generalized Distance Weighted Discrimination
- Functional Analysis
- Fused multiple graphical lasso
- Gradient directed regularization for sparse Gaussian concentration graphs, with applications to inference of genetic networks
- Large-scale sparse inverse covariance matrix estimation
- Model Selection and Estimation in Regression with Grouped Variables
- Model selection through sparse maximum likelihood estimation for multivariate Gaussian or binary data
- Monotone Operators and the Proximal Point Algorithm
- On efficiently solving the subproblems of a level-set method for fused lasso problems
- On the R-superlinear convergence of the KKT residuals generated by the augmented Lagrangian method for convex composite conic programming
- Proximité et dualité dans un espace hilbertien
- QUIC: quadratic approximation for sparse inverse covariance estimation
- Solving log-determinant optimization problems by a Newton-CG primal proximal point algorithm
- Solving variational inequality problems via smoothing-nonsmooth reformulations
- Some continuity properties of polyhedral multifunctions
- Sparse Reconstruction by Separable Approximation
- Sparse inverse covariance estimation with the graphical lasso
- Sparse permutation invariant covariance estimation
- The Joint Graphical Lasso for Inverse Covariance Estimation Across Multiple Classes
- The Strong Second-Order Sufficient Condition and Constraint Nondegeneracy in Nonlinear Semidefinite Programming and Their Implications
- Variational Analysis
Cited in (8)
- A proximal point algorithm for log-determinant optimization with group Lasso regularization
- A dual semismooth Newton based augmented Lagrangian method for large-scale linearly constrained sparse group square-root Lasso problems
- An efficient algorithm for Fantope-constrained sparse principal subspace estimation problem
- Variational low-light image enhancement based on fractional-order differential
- An efficient linearly convergent regularized proximal point algorithm for fused multiple graphical Lasso problems
- Gradient projection Newton algorithm for sparse collaborative learning using synthetic and real datasets of applications
- Efficient sparse Hessian-based semismooth Newton algorithms for Dantzig selector
- Sparse precision matrix estimation with missing observations