Implicit regularization in nonconvex statistical estimation: gradient descent converges linearly for phase retrieval, matrix completion, and blind deconvolution
DOI: 10.1007/S10208-019-09429-9 · zbMATH Open: 1445.90089 · arXiv: 1711.10467 · OpenAlex: W2768927505 · Wikidata: Q127406705 · MaRDI QID: Q2189396
Yuejie Chi, Yuxin Chen, Cong Ma, Kaizheng Wang
Publication date: 15 June 2020
Published in: Foundations of Computational Mathematics
Full work available at URL: https://arxiv.org/abs/1711.10467
Recommendations
- Gradient descent with random initialization: fast global convergence for nonconvex phase retrieval
- Gradient descent with non-convex constraints: local concavity determines convergence
- Nonconvex Low-Rank Tensor Completion from Noisy Data
- The nonsmooth landscape of phase retrieval
- Fast global convergence of gradient methods for high-dimensional statistical recovery
Keywords: matrix completion; nonconvex optimization; blind deconvolution; phase retrieval; gradient descent; leave-one-out analysis
Cites Work
- Matrix completion from noisy entries
- Matrix completion and low-rank SVD via fast alternating least squares
- Phase Retrieval via Wirtinger Flow: Theory and Algorithms
- PhaseMax: Convex Phase Retrieval via Basis Pursuit
- GESPAR: Efficient Phase Retrieval of Sparse Signals
- Fast, robust and non-convex subspace recovery
- ADMiRA: Atomic Decomposition for Minimum Rank Approximation
- Phase retrieval via matrix completion
- The Rotation of Eigenvectors by a Perturbation. III
- Robust principal component analysis?
- A useful variant of the Davis–Kahan theorem for statisticians
- Hanson-Wright inequality and sub-Gaussian concentration
- 10.1162/153244302760200704
- Exact matrix completion via convex optimization
- Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion
- Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization
- Inference and uncertainty quantification for noisy matrix completion
- The Power of Convex Relaxation: Near-Optimal Matrix Completion
- Restricted strong convexity and weighted matrix completion: Optimal bounds with noise
- Oracle inequalities in empirical risk minimization and sparse recovery problems. École d'Été de Probabilités de Saint-Flour XXXVIII-2008.
- A tail inequality for quadratic forms of subgaussian random vectors
- Rank-Sparsity Incoherence for Matrix Decomposition
- Title not available
- An Introduction to Matrix Concentration Inequalities
- Perturbation bounds in connection with singular value decomposition
- A Probabilistic and RIPless Theory of Compressed Sensing
- Matrix Completion From a Few Entries
- Orthogonal Procrustes rotation for two or more matrices
- The probabilistic method. With an appendix on the life and work of Paul Erdős.
- Perturbation Bounds for the Polar Decomposition
- Estimating the matrix \(p\)-norm
- ROP: matrix recovery via rank-one projections
- Title not available
- Recovering Low-Rank Matrices From Few Coefficients in Any Basis
- A Simpler Approach to Matrix Completion
- Low-rank matrix completion using alternating minimization
- A note on \(\sin\Theta\) theorems for singular subspace variations
- Solving quadratic equations via phaselift when there are about as many equations as unknowns
- The spectral norm of a nonnegative matrix
- Convex optimization: algorithms and complexity
- On robust regression with high-dimensional predictors
- Optimal rates of convergence for noisy sparse phase retrieval via thresholded Wirtinger flow
- Convex Recovery of a Structured Signal from Independent Random Linear Measurements
- Exact and Stable Covariance Estimation From Quadratic Sampling via Convex Programming
- Incoherence-Optimal Matrix Completion
- Blind Deconvolution Using Convex Programming
- Compressive Phase Retrieval via Generalized Approximate Message Passing
- Rapid, robust, and reliable blind deconvolution via nonconvex optimization
- Concentration and Moment Inequalities for Polynomials of Independent Random Variables
- Guaranteed Matrix Completion via Non-Convex Factorization
- Complete Dictionary Recovery Over the Sphere I: Overview and the Geometric Picture
- Symmetry, Saddle Points, and Global Optimization Landscape of Nonconvex Matrix Factorization
- An elementary proof of convex phase retrieval in the natural parameter space via the linear program PhaseMax
- Self-calibration and biconvex compressive sensing
- Solving Systems of Random Quadratic Equations via Truncated Amplitude Flow
- Structured Signal Recovery From Quadratic Measurements: Breaking Sample Complexity Barriers via Nonconvex Optimization
- The Projected Power Method: An Efficient Algorithm for Joint Alignment from Pairwise Differences
- Nonconvex Optimization Meets Low-Rank Matrix Factorization: An Overview
- Low rank matrix completion by alternating steepest descent methods
- Guarantees of Riemannian optimization for low rank matrix recovery
- Perturbation bounds for matrix square roots and Pythagorean sums
- Debiasing the Lasso: optimal sample size for Gaussian designs
- Solving Random Quadratic Systems of Equations Is Nearly as Easy as Solving Linear Systems
- Phase retrieval with one or two diffraction patterns by alternating projections with the null initialization
- A flexible convex relaxation for phase retrieval
- Fast rank-one alternating minimization algorithm for phase retrieval
- Solving systems of phaseless equations via Kaczmarz methods: a proof of concept study
- Phase Retrieval Using Alternating Minimization
- The local convexity of solving systems of quadratic equations
- A Well-Tempered Landscape for Non-convex Robust Subspace Recovery
- The landscape of empirical risk for nonconvex losses
- Regularized gradient descent: a non-convex recipe for fast joint blind deconvolution and demixing
- Theoretical Insights Into the Optimization Landscape of Over-Parameterized Shallow Neural Networks
- Blind Recovery of Sparse Signals From Subsampled Convolution
- Non-Convex Phase Retrieval From STFT Measurements
- Title not available
- Gradient descent with random initialization: fast global convergence for nonconvex phase retrieval
- Sparse Phase Retrieval via Truncated Amplitude Flow
- The likelihood ratio test in high-dimensional logistic regression is asymptotically a rescaled Chi-square
- Fundamental limits of weak recovery with applications to phase retrieval
- Title not available
- Near-Optimal Bounds for Phase Synchronization
- Phase retrieval via randomized Kaczmarz: theoretical guarantees
- Spectral method and regularized MLE are both optimal for top-\(K\) ranking
- Fast and Guaranteed Blind Multichannel Deconvolution Under a Bilinear System Model
- Blind Deconvolution by a Steepest Descent Algorithm on a Quotient Manifold
Cited In (37)
- GNMR: A Provable One-Line Algorithm for Low Rank Matrix Recovery
- Approximate message passing with spectral initialization for generalized linear models*
- Title not available
- Improved Performance Guarantees for Orthogonal Group Synchronization via Generalized Power Method
- The numerics of phase retrieval
- Title not available
- Gradient descent with random initialization: fast global convergence for nonconvex phase retrieval
- Low-rank matrix recovery under heavy-tailed errors
- Title not available
- First-Order Methods for Nonconvex Quadratic Minimization
- Smoothed amplitude flow-based phase retrieval algorithm
- An Equivalence between Critical Points for Rank Constraints Versus Low-Rank Factorizations
- A diffusion process perspective on posterior contraction rates for parameters
- Title not available
- Median-Truncated Gradient Descent: A Robust and Scalable Nonconvex Approach for Signal Estimation
- Entrywise eigenvector analysis of random matrices with low expected rank
- Exact Recovery of Multichannel Sparse Blind Deconvolution via Gradient Descent
- Normal approximation and confidence region of singular subspaces
- Non-convex exact community recovery in stochastic block model
- Near-optimal bounds for generalized orthogonal Procrustes problem via generalized power method
- Byzantine-robust distributed sparse learning for \(M\)-estimation
- Near-optimal performance bounds for orthogonal and permutation group synchronization via spectral methods
- Subspace estimation from unbalanced and incomplete data matrices: \({\ell_{2,\infty}}\) statistical guarantees
- Lower bounds for non-convex stochastic optimization
- Inference for low-rank completion without sample splitting with application to treatment effect estimation
- Estimation from nonlinear observations via convex programming with application to bilinear regression
- Spectral method and regularized MLE are both optimal for top-\(K\) ranking
- Nonconvex Low-Rank Tensor Completion from Noisy Data
- Bridging convex and nonconvex optimization in robust PCA: noise, outliers and missing data
- Asymmetry helps: eigenvalue and eigenvector analyses of asymmetrically perturbed low-rank matrices
- Matrix completion with nonconvex regularization: spectral operators and scalable algorithms
- Truncated amplitude flow with coded diffraction patterns
- Adaptive iterative hard thresholding for low-rank matrix recovery and rank-one measurements
- Robust recovery of low-rank matrices with non-orthogonal sparse decomposition from incomplete measurements
- Noisy Matrix Completion: Understanding Statistical Guarantees for Convex Relaxation via Nonconvex Optimization
- Inference for heteroskedastic PCA with missing data
- Proof methods for robust low-rank matrix recovery