Gradient descent with random initialization: fast global convergence for nonconvex phase retrieval
From MaRDI portal
Publication: 2425162
DOI: 10.1007/S10107-019-01363-6
zbMATH Open: 1415.90086
arXiv: 1803.07726
OpenAlex: W3123272904
Wikidata: Q128449469 (Scholia: Q128449469)
MaRDI QID: Q2425162
FDO: Q2425162
Yuxin Chen, Jianqing Fan, Yuejie Chi, Cong Ma
Publication date: 26 June 2019
Published in: Mathematical Programming. Series A. Series B
Abstract: This paper considers the problem of solving systems of quadratic equations, namely, recovering an object of interest \(x^{\natural} \in \mathbb{R}^n\) from \(m\) quadratic equations/samples \(y_i = (a_i^{\top} x^{\natural})^2\), \(1 \le i \le m\). This problem, also dubbed phase retrieval, spans multiple domains including physical sciences and machine learning. We investigate the efficiency of gradient descent (or Wirtinger flow) designed for the nonconvex least squares problem. We prove that under Gaussian designs, gradient descent, when randomly initialized, yields an \(\epsilon\)-accurate solution in \(O(\log n + \log(1/\epsilon))\) iterations given nearly minimal samples, thus achieving near-optimal computational and sample complexities at once. This provides the first global convergence guarantee concerning vanilla gradient descent for phase retrieval, without the need for (i) carefully designed initialization, (ii) sample splitting, or (iii) sophisticated saddle-point escaping schemes. All of this is achieved by exploiting the statistical models in analyzing optimization algorithms, via a leave-one-out approach that decouples certain statistical dependencies between the gradient descent iterates and the data.
Full work available at URL: https://arxiv.org/abs/1803.07726
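The abstract describes a complete pipeline: form the nonconvex least-squares loss \(f(x) = \frac{1}{4m}\sum_{i=1}^{m} \big((a_i^{\top} x)^2 - y_i\big)^2\), initialize at a random point, and run plain gradient descent with a constant step size. Below is a minimal NumPy sketch of that procedure under Gaussian designs; it is an illustration rather than the authors' code, and the concrete choices of dimension, sample size, step size, and iteration budget are assumptions made for demonstration.

```python
# Minimal sketch of randomly initialized gradient descent (Wirtinger flow)
# for real-valued phase retrieval, following the setup in the abstract:
# recover x from y_i = (a_i^T x)^2 by gradient descent on
#   f(x) = (1/4m) * sum_i ((a_i^T x)^2 - y_i)^2.
# The parameters below (n, m, eta, iteration count) are illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
n, m = 100, 1000                       # dimension and sample size
x_star = rng.standard_normal(n)
x_star /= np.linalg.norm(x_star)       # ground truth, normalized for convenience
A = rng.standard_normal((m, n))        # Gaussian design vectors a_i as rows
y = (A @ x_star) ** 2                  # phaseless (magnitude-squared) measurements

x = rng.standard_normal(n) / np.sqrt(n)  # random initialization: x0 ~ N(0, I/n)
eta = 0.1                                # constant step size
for _ in range(500):
    Ax = A @ x
    x -= eta * (A.T @ ((Ax**2 - y) * Ax) / m)  # gradient of the quartic loss

# The signal is identifiable only up to a global sign, so compare against both.
err = min(np.linalg.norm(x - x_star), np.linalg.norm(x + x_star))
print(f"estimation error: {err:.2e}")
```

Note that no spectral initialization, sample splitting, or explicit saddle-escaping step appears in the loop: the paper's point is that this vanilla iteration already converges globally under Gaussian designs.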
Recommendations
- On global convergence of gradient descent algorithms for generalized phase retrieval problem
- Phase Retrieval by Alternating Minimization With Random Initialization
- Convolutional Phase Retrieval via Gradient Descent
- Scalable incremental nonconvex optimization approach for phase retrieval
- Solving phase retrieval with random initial guess is nearly as good as by spectral initialization
- Implicit regularization in nonconvex statistical estimation: gradient descent converges linearly for phase retrieval, matrix completion, and blind deconvolution
- Variational phase retrieval with globally convergent preconditioned proximal algorithm
- On the Global Convergence of Randomized Coordinate Gradient Descent for Nonconvex Optimization
- Phase retrieval via randomized Kaczmarz: theoretical guarantees
- PhaseEqual: Convex Phase Retrieval via Alternating Direction Method of Multipliers
Cites Work
- PhaseLift: exact and stable signal recovery from magnitude measurements via convex programming
- Phase Retrieval via Wirtinger Flow: Theory and Algorithms
- Phase retrieval via matrix completion
- Entrywise eigenvector analysis of random matrices with low expected rank
- Matrix Completion From a Few Entries
- ROP: matrix recovery via rank-one projections
- Saving phase: injectivity and stability for phase retrieval
- Stable optimizationless recovery from phaseless linear measurements
- Solving quadratic equations via phaselift when there are about as many equations as unknowns
- Cubic regularization of Newton method and its global performance
- On robust regression with high-dimensional predictors
- Optimal rates of convergence for noisy sparse phase retrieval via thresholded Wirtinger flow
- Exact and Stable Covariance Estimation From Quadratic Sampling via Convex Programming
- Low rank matrix recovery from rank one measurements
- Rapid, robust, and reliable blind deconvolution via nonconvex optimization
- Concentration and Moment Inequalities for Polynomials of Independent Random Variables
- Guaranteed Matrix Completion via Non-Convex Factorization
- On the impact of predictor geometry on the performance on high-dimensional ridge-regularized generalized robust regression estimators
- Solving Systems of Random Quadratic Equations via Truncated Amplitude Flow
- Structured Signal Recovery From Quadratic Measurements: Breaking Sample Complexity Barriers via Nonconvex Optimization
- The Projected Power Method: An Efficient Algorithm for Joint Alignment from Pairwise Differences
- Nonconvex Optimization Meets Low-Rank Matrix Factorization: An Overview
- Solving Random Quadratic Systems of Equations Is Nearly as Easy as Solving Linear Systems
- Phase retrieval with one or two diffraction patterns by alternating projections with the null initialization
- Fast rank-one alternating minimization algorithm for phase retrieval
- Solving systems of phaseless equations via Kaczmarz methods: a proof of concept study
- Phase Retrieval Using Alternating Minimization
- Theoretical Insights Into the Optimization Landscape of Over-Parameterized Shallow Neural Networks
- Non-Convex Phase Retrieval From STFT Measurements
- Convolutional Phase Retrieval via Gradient Descent
- Solving (most) of a set of quadratic equalities: composite optimization for robust phase retrieval
- Implicit regularization in nonconvex statistical estimation: gradient descent converges linearly for phase retrieval, matrix completion, and blind deconvolution
- The likelihood ratio test in high-dimensional logistic regression is asymptotically a rescaled Chi-square
- Fundamental limits of weak recovery with applications to phase retrieval
- Near-Optimal Bounds for Phase Synchronization
- Phase retrieval via randomized Kaczmarz: theoretical guarantees
- Spectral method and regularized MLE are both optimal for top-\(K\) ranking
- Misspecified nonconvex statistical optimization for sparse phase retrieval
- Asymmetry helps: eigenvalue and eigenvector analyses of asymmetrically perturbed low-rank matrices
- Sparse and Low-Rank Tensor Estimation via Cubic Sketchings
- Optimization-Based AMP for Phase Retrieval: The Impact of Initialization and $\ell_{2}$ Regularization
- Nonconvex Matrix Factorization From Rank-One Measurements
- Revisiting Normalized Gradient Descent: Fast Evasion of Saddle Points
- Blind Deconvolution by a Steepest Descent Algorithm on a Quotient Manifold
- Compressive Phase Retrieval via Reweighted Amplitude Flow
Cited In (36)
- Nearly optimal bounds for the global geometric landscape of phase retrieval
- The numerics of phase retrieval
- Anisotropic Diffusion in Consensus-Based Optimization on the Sphere
- Complex phase retrieval from subgaussian measurements
- Low-rank matrix recovery with composite optimization: good conditioning and rapid convergence
- Implicit regularization in nonconvex statistical estimation: gradient descent converges linearly for phase retrieval, matrix completion, and blind deconvolution
- Asymptotic Properties of Stationary Solutions of Coupled Nonconvex Nonsmooth Empirical Risk Minimization
- On the Convergence of Mirror Descent beyond Stochastic Convex Programming
- Median-Truncated Gradient Descent: A Robust and Scalable Nonconvex Approach for Signal Estimation
- Fast gradient method for low-rank matrix estimation
- Exact Recovery of Multichannel Sparse Blind Deconvolution via Gradient Descent
- Sharp global convergence guarantees for iterative nonconvex optimization with random data
- Provable Phase Retrieval with Mirror Descent
- Role of sparsity and structure in the optimization landscape of non-convex matrix sensing
- Tensor factorization recommender systems with dependency
- Near-optimal bounds for generalized orthogonal Procrustes problem via generalized power method
- A new complexity metric for nonconvex rank-one generalized matrix completion
- A selective overview of deep learning
- Multicategory Angle-Based Learning for Estimating Optimal Dynamic Treatment Regimes With Censored Data
- Spectral method and regularized MLE are both optimal for top-\(K\) ranking
- Sparse signal recovery from phaseless measurements via hard thresholding pursuit
- Convolutional Phase Retrieval via Gradient Descent
- Consensus-based optimization on hypersurfaces: Well-posedness and mean-field limit
- Nonconvex Low-Rank Tensor Completion from Noisy Data
- Bridging convex and nonconvex optimization in robust PCA: noise, outliers and missing data
- Revisiting Landscape Analysis in Deep Neural Networks: Eliminating Decreasing Paths to Infinity
- Convex and Nonconvex Optimization Are Both Minimax-Optimal for Noisy Blind Deconvolution Under Random Designs
- Recent Theoretical Advances in Non-Convex Optimization
- Noisy Matrix Completion: Understanding Statistical Guarantees for Convex Relaxation via Nonconvex Optimization
- On the geometric analysis of a quartic-quadratic optimization problem under a spherical constraint
- Inference for heteroskedastic PCA with missing data
- Homomorphic sensing of subspace arrangements
- Randomly initialized EM algorithm for two-component Gaussian mixture achieves near optimality in \(O(\sqrt{n})\) iterations