A Riemannian gossip approach to subspace learning on Grassmann manifold

From MaRDI portal
Publication:2320541

DOI: 10.1007/s10994-018-05775-x
zbMATH Open: 1493.68309
arXiv: 1705.00467
OpenAlex: W2790379190
Wikidata: Q115381966
Scholia: Q115381966
MaRDI QID: Q2320541
FDO: Q2320541


Authors: Bamdev Mishra, Hiroyuki Kasai, Pratik Jawanpuria, Atul Saroop


Publication date: 23 August 2019

Published in: Machine Learning

Abstract: In this paper, we focus on subspace learning problems on the Grassmann manifold. Interesting applications in this setting include low-rank matrix completion and low-dimensional multivariate regression, among others. Motivated by privacy concerns, we aim to solve such problems in a decentralized setting where multiple agents have access to (and solve) only a part of the whole optimization problem. The agents communicate with each other to arrive at a consensus, i.e., agree on a common quantity, via the gossip protocol. We propose a novel cost function for subspace learning on the Grassmann manifold, which is a weighted sum of several sub-problems (each solved by an agent) and the communication cost among the agents. The cost function has a finite sum structure. In the proposed modeling approach, different agents learn individual local subspace but they achieve asymptotic consensus on the global learned subspace. The approach is scalable and parallelizable. Numerical experiments show the efficacy of the proposed decentralized algorithms on various matrix completion and multivariate regression benchmarks.
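The consensus mechanism described in the abstract — agents holding individual local subspaces that are driven toward agreement on a common global subspace — can be illustrated with a minimal sketch. This is a hypothetical, simplified illustration only: the `gossip_step` below uses a naive projector-averaging update between two agents, not the paper's actual algorithm, which minimizes a weighted finite-sum cost (sub-problem losses plus communication cost) by Riemannian optimization on the Grassmann manifold.

```python
import numpy as np

def random_subspace(n, r, rng):
    # A point on the Grassmann manifold Gr(n, r), represented by an
    # n x r matrix with orthonormal columns spanning the subspace.
    Q, _ = np.linalg.qr(rng.standard_normal((n, r)))
    return Q

def projection_distance(U, V):
    # Frobenius distance between orthogonal projectors; it is zero exactly
    # when U and V span the same subspace, so it is well defined on Gr(n, r).
    return np.linalg.norm(U @ U.T - V @ V.T)

def gossip_step(U, V):
    # One symmetric gossip exchange (hypothetical scheme, not the paper's):
    # both agents move to the dominant r-dimensional subspace of the
    # averaged projector, reaching exact consensus in a single exchange.
    r = U.shape[1]
    P = 0.5 * (U @ U.T + V @ V.T)
    _, Q = np.linalg.eigh(P)      # eigenvalues in ascending order
    W = Q[:, -r:]                 # top-r eigenvectors span the consensus subspace
    return W, W

rng = np.random.default_rng(0)
U, V = random_subspace(5, 2, rng), random_subspace(5, 2, rng)
U, V = gossip_step(U, V)
print(projection_distance(U, V))
```

In the paper's multi-agent setting, each agent would instead take gradient steps that trade off its local data-fitting term against the consensus term, so agreement is reached only asymptotically rather than in one exchange as in this toy sketch.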


Full work available at URL: https://arxiv.org/abs/1705.00467










Cited In (4)






