Gradient descent with random initialization: fast global convergence for nonconvex phase retrieval

From MaRDI portal

DOI: 10.1007/s10107-019-01363-6
zbMATH Open: 1415.90086
arXiv: 1803.07726
OpenAlex: W3123272904
Wikidata: Q128449469
Scholia: Q128449469
MaRDI QID: Q2425162
FDO: Q2425162

Yuxin Chen, Jianqing Fan, Yuejie Chi, Cong Ma

Publication date: 26 June 2019

Published in: Mathematical Programming. Series A. Series B

Abstract: This paper considers the problem of solving systems of quadratic equations, namely, recovering an object of interest $\mathbf{x}^{\natural} \in \mathbb{R}^{n}$ from $m$ quadratic equations/samples $y_{i} = (\mathbf{a}_{i}^{\top} \mathbf{x}^{\natural})^{2}$, $1 \leq i \leq m$. This problem, also dubbed phase retrieval, spans multiple domains including physical sciences and machine learning. We investigate the efficiency of gradient descent (or Wirtinger flow) designed for the nonconvex least-squares problem. We prove that, under Gaussian designs, gradient descent, when randomly initialized, yields an $\epsilon$-accurate solution in $O(\log n + \log(1/\epsilon))$ iterations given nearly minimal samples, thus achieving near-optimal computational and sample complexities at once. This provides the first global convergence guarantee concerning vanilla gradient descent for phase retrieval, without the need for (i) carefully designed initialization, (ii) sample splitting, or (iii) sophisticated saddle-point escaping schemes. All of this is achieved by exploiting the statistical models in analyzing optimization algorithms, via a leave-one-out approach that decouples certain statistical dependencies between the gradient descent iterates and the data.
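
To make the objective and update rule concrete, here is a minimal NumPy sketch of randomly initialized gradient descent (real-valued Wirtinger flow) on the least-squares loss $f(\mathbf{x}) = \frac{1}{4m}\sum_{i=1}^{m}\big((\mathbf{a}_{i}^{\top}\mathbf{x})^{2} - y_{i}\big)^{2}$. The problem sizes, step size, and iteration budget below are illustrative assumptions, not the constants analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem sizes (assumptions, not the paper's settings).
n, m = 100, 1000
x_star = rng.standard_normal(n)        # ground truth x^natural
A = rng.standard_normal((m, n))        # Gaussian design vectors a_i (rows of A)
y = (A @ x_star) ** 2                  # quadratic samples y_i = (a_i^T x^natural)^2

# Vanilla gradient descent from a random start: no spectral initialization,
# no sample splitting, no explicit saddle-escaping scheme.
x = rng.standard_normal(n)             # random initialization
eta = 0.1 / np.mean(y)                 # heuristic step size ~ 1/||x^natural||^2

for _ in range(500):
    Ax = A @ x
    # Gradient of f: (1/m) * sum_i ((a_i^T x)^2 - y_i) (a_i^T x) a_i
    grad = (A.T @ ((Ax ** 2 - y) * Ax)) / m
    x = x - eta * grad

# x^natural is identifiable only up to a global sign, so measure the error
# against both +x^natural and -x^natural.
err = min(np.linalg.norm(x - x_star),
          np.linalg.norm(x + x_star)) / np.linalg.norm(x_star)
print(f"relative error: {err:.2e}")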


Full work available at URL: https://arxiv.org/abs/1803.07726




Cited In (36)

