A Riemannian gossip approach to subspace learning on Grassmann manifold
From MaRDI portal
Publication:2320541
Abstract: In this paper, we focus on subspace learning problems on the Grassmann manifold. Interesting applications in this setting include low-rank matrix completion and low-dimensional multivariate regression, among others. Motivated by privacy concerns, we aim to solve such problems in a decentralized setting where multiple agents have access to (and solve) only a part of the whole optimization problem. The agents communicate with each other to arrive at a consensus, i.e., to agree on a common quantity, via the gossip protocol. We propose a novel cost function for subspace learning on the Grassmann manifold, which is a weighted sum of several sub-problems (each solved by an agent) and of the communication cost among the agents. The cost function has a finite-sum structure. In the proposed modeling approach, each agent learns an individual local subspace, yet the agents achieve asymptotic consensus on the global learned subspace. The approach is scalable and parallelizable. Numerical experiments show the efficacy of the proposed decentralized algorithms on various matrix completion and multivariate regression benchmarks.
Recommendations
- Riemannian geometry of Grassmann manifolds with a view on algorithmic computation
- Dictionary Learning on Grassmann Manifolds
- Local convergence of an algorithm for subspace identification from partial data
- Low-rank matrix completion by Riemannian optimization
- Decentralized and privacy-preserving low-rank matrix completion
Cites work
- scientific article; zbMATH DE number 5957307
- scientific article; zbMATH DE number 5454133
- scientific article; zbMATH DE number 1448976
- scientific article; zbMATH DE number 5223994
- A Bayesian/information theoretic model of learning to learn via multiple task sampling
- A Geometric Approach to Low-Rank Matrix Completion
- A Singular Value Thresholding Algorithm for Matrix Completion
- A new approach to collaborative filtering: operator estimation with spectral regularization
- Consensus optimization on manifolds
- Convex multi-task feature learning
- Decentralized and privacy-preserving low-rank matrix completion
- Dictionary Learning on Grassmann Manifolds
- Exact matrix completion via convex optimization
- Fixed-rank matrix factorizations and Riemannian low-rank optimization
- Flexible latent variable models for multi-task learning
- Gossip algorithms
- Kernels for vector-valued functions: a review
- Learning multiple tasks with kernel methods
- Learning using privileged information: SVM+ and weighted SVM
- Low-rank matrix completion via preconditioned optimization on the Grassmann manifold
- Manopt, a Matlab toolbox for optimization on manifolds
- Matrix Completion From a Few Entries
- Model Selection and Estimation in Regression with Grouped Variables
- Parallel stochastic gradient algorithms for large-scale matrix completion
- Pattern recognition and machine learning.
- Regression on fixed-rank positive semidefinite matrices: a Riemannian approach
- Riemannian Consensus for Manifolds With Bounded Curvature
- Riemannian Stochastic Variance Reduced Gradient Algorithm with Retraction and Vector Transport
- Solving a low-rank factorization model for matrix completion by a nonlinear successive over-relaxation algorithm
- Stochastic Gradient Descent on Riemannian Manifolds
- Structured low-rank approximation with missing data
- Subspace Evolution and Transfer (SET) for Low-Rank Matrix Completion
- The Geometry of Algorithms with Orthogonality Constraints
Cited in (4)