Gradient descent with random initialization: fast global convergence for nonconvex phase retrieval
From MaRDI portal
Publication: 2425162
DOI: 10.1007/S10107-019-01363-6
zbMATH: 1415.90086
arXiv: 1803.07726
OpenAlex: W3123272904
Wikidata: Q128449469 (Scholia: Q128449469)
MaRDI QID: Q2425162
Yuxin Chen, Yuejie Chi, Cong Ma, Jianqing Fan
Publication date: 26 June 2019
Published in: Mathematical Programming. Series A. Series B
Full work available at URL: https://arxiv.org/abs/1803.07726
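The publication analyzes plain gradient descent, started from a random initialization, on the nonconvex least-squares formulation of phase retrieval (recovering a signal from intensity-only measurements). The following is a minimal illustrative sketch of that setup in NumPy; the sample size, step size, iteration budget, and initialization scale below are assumptions for the example, not the constants from the paper.

```python
# Minimal sketch: randomly initialized gradient descent for real-valued phase
# retrieval, i.e. recovering x* from y_i = (a_i^T x*)^2 with Gaussian a_i.
# Problem sizes, step size, and iteration count are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n, m = 100, 1000                       # signal dimension, number of measurements (assumed)
x_star = rng.standard_normal(n)
x_star /= np.linalg.norm(x_star)       # ground-truth signal with unit norm
A = rng.standard_normal((m, n))        # i.i.d. Gaussian sensing vectors a_i as rows
y = (A @ x_star) ** 2                  # phaseless (intensity) measurements

def grad(x):
    """Gradient of f(x) = (1/4m) * sum_i ((a_i^T x)^2 - y_i)^2."""
    z = A @ x
    return A.T @ ((z ** 2 - y) * z) / m

x = rng.standard_normal(n) / np.sqrt(n)   # random initialization, no spectral step
eta = 0.1                                  # constant step size (assumed)
for _ in range(500):
    x -= eta * grad(x)

# x* is identifiable only up to a global sign, so measure distance to +/- x*.
dist = min(np.linalg.norm(x - x_star), np.linalg.norm(x + x_star))
print(f"distance to ground truth (up to sign): {dist:.2e}")
```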
Related Items (32)
- Multicategory Angle-Based Learning for Estimating Optimal Dynamic Treatment Regimes With Censored Data
- The numerics of phase retrieval
- Role of sparsity and structure in the optimization landscape of non-convex matrix sensing
- Tensor factorization recommender systems with dependency
- Revisiting Landscape Analysis in Deep Neural Networks: Eliminating Decreasing Paths to Infinity
- Nonconvex Low-Rank Tensor Completion from Noisy Data
- Sparse signal recovery from phaseless measurements via hard thresholding pursuit
- Anisotropic Diffusion in Consensus-Based Optimization on the Sphere
- Sharp global convergence guarantees for iterative nonconvex optimization with random data
- Implicit regularization in nonconvex statistical estimation: gradient descent converges linearly for phase retrieval, matrix completion, and blind deconvolution
- Convex and Nonconvex Optimization Are Both Minimax-Optimal for Noisy Blind Deconvolution Under Random Designs
- Fast gradient method for low-rank matrix estimation
- Unnamed Item
- Provable Phase Retrieval with Mirror Descent
- Near-optimal bounds for generalized orthogonal Procrustes problem via generalized power method
- Noisy Matrix Completion: Understanding Statistical Guarantees for Convex Relaxation via Nonconvex Optimization
- Recent Theoretical Advances in Non-Convex Optimization
- Complex phase retrieval from subgaussian measurements
- Median-Truncated Gradient Descent: A Robust and Scalable Nonconvex Approach for Signal Estimation
- Exact Recovery of Multichannel Sparse Blind Deconvolution via Gradient Descent
- Homomorphic sensing of subspace arrangements
- Unnamed Item
- A selective overview of deep learning
- Bridging convex and nonconvex optimization in robust PCA: noise, outliers and missing data
- Consensus-based optimization on hypersurfaces: Well-posedness and mean-field limit
- Spectral method and regularized MLE are both optimal for top-\(K\) ranking
- Low-rank matrix recovery with composite optimization: good conditioning and rapid convergence
- On the geometric analysis of a quartic-quadratic optimization problem under a spherical constraint
- On the Convergence of Mirror Descent beyond Stochastic Convex Programming
- Unnamed Item
- Asymptotic Properties of Stationary Solutions of Coupled Nonconvex Nonsmooth Empirical Risk Minimization
- Randomly initialized EM algorithm for two-component Gaussian mixture achieves near optimality in \(O(\sqrt{n})\) iterations
Uses Software
Cites Work
- Unnamed Item
- Unnamed Item
- Optimal rates of convergence for noisy sparse phase retrieval via thresholded Wirtinger flow
- Low rank matrix recovery from rank one measurements
- Stable optimizationless recovery from phaseless linear measurements
- Solving quadratic equations via phaselift when there are about as many equations as unknowns
- On the impact of predictor geometry on the performance on high-dimensional ridge-regularized generalized robust regression estimators
- Phase retrieval with one or two diffraction patterns by alternating projections with the null initialization
- Fast rank-one alternating minimization algorithm for phase retrieval
- Implicit regularization in nonconvex statistical estimation: gradient descent converges linearly for phase retrieval, matrix completion, and blind deconvolution
- Entrywise eigenvector analysis of random matrices with low expected rank
- Saving phase: injectivity and stability for phase retrieval
- The likelihood ratio test in high-dimensional logistic regression is asymptotically a rescaled Chi-square
- Spectral method and regularized MLE are both optimal for top-\(K\) ranking
- Rapid, robust, and reliable blind deconvolution via nonconvex optimization
- ROP: matrix recovery via rank-one projections
- Fundamental limits of weak recovery with applications to phase retrieval
- Misspecified nonconvex statistical optimization for sparse phase retrieval
- Cubic regularization of Newton method and its global performance
- Asymmetry helps: eigenvalue and eigenvector analyses of asymmetrically perturbed low-rank matrices
- PhaseLift: Exact and Stable Signal Recovery from Magnitude Measurements via Convex Programming
- On robust regression with high-dimensional predictors
- Guaranteed Matrix Completion via Non-Convex Factorization
- Exact and Stable Covariance Estimation From Quadratic Sampling via Convex Programming
- Phase Retrieval via Wirtinger Flow: Theory and Algorithms
- Solving Random Quadratic Systems of Equations Is Nearly as Easy as Solving Linear Systems
- Solving systems of phaseless equations via Kaczmarz methods: a proof of concept study
- Non-Convex Phase Retrieval From STFT Measurements
- Solving Systems of Random Quadratic Equations via Truncated Amplitude Flow
- Phase Retrieval Using Alternating Minimization
- The Projected Power Method: An Efficient Algorithm for Joint Alignment from Pairwise Differences
- Theoretical Insights Into the Optimization Landscape of Over-Parameterized Shallow Neural Networks
- Compressive Phase Retrieval via Reweighted Amplitude Flow
- Near-Optimal Bounds for Phase Synchronization
- Nonconvex Matrix Factorization From Rank-One Measurements
- Solving (most) of a set of quadratic equalities: composite optimization for robust phase retrieval
- Convolutional Phase Retrieval via Gradient Descent
- Sparse and Low-Rank Tensor Estimation via Cubic Sketchings
- Revisiting Normalized Gradient Descent: Fast Evasion of Saddle Points
- Structured Signal Recovery From Quadratic Measurements: Breaking Sample Complexity Barriers via Nonconvex Optimization
- Optimization-Based AMP for Phase Retrieval: The Impact of Initialization and $\ell_{2}$ Regularization
- Blind Deconvolution by a Steepest Descent Algorithm on a Quotient Manifold
- Nonconvex Optimization Meets Low-Rank Matrix Factorization: An Overview
- Phase retrieval via randomized Kaczmarz: theoretical guarantees
- Matrix Completion From a Few Entries
- Concentration and Moment Inequalities for Polynomials of Independent Random Variables
- Phase Retrieval via Matrix Completion