Implicit regularization in nonconvex statistical estimation: gradient descent converges linearly for phase retrieval, matrix completion, and blind deconvolution
Publication: 2189396 (MaRDI QID: Q2189396)
DOI: 10.1007/S10208-019-09429-9
zbMath: 1445.90089
arXiv: 1711.10467
OpenAlex: W2768927505
Wikidata: Q127406705 (Scholia: Q127406705)
Yuejie Chi, Yuxin Chen, Cong Ma, Kaizheng Wang
Publication date: 15 June 2020
Published in: Foundations of Computational Mathematics
Full work available at URL: https://arxiv.org/abs/1711.10467
Keywords: matrix completion; nonconvex optimization; blind deconvolution; gradient descent; phase retrieval; leave-one-out analysis
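As a rough illustration of the setting named in the title, the sketch below runs plain (unregularized) gradient descent with spectral initialization on the phase retrieval least-squares loss, the kind of iteration whose implicit regularization the paper analyzes. This is a minimal illustrative sketch, not the paper's code: the dimensions, step size, iteration count, and random-seed choices are assumptions made for this example.

```python
# Illustrative sketch: vanilla gradient descent with spectral initialization
# for real-valued phase retrieval (all problem sizes and constants are
# assumptions for this example, not values taken from the paper).
import numpy as np

rng = np.random.default_rng(0)
n, m = 50, 600                      # signal dimension, number of measurements
x_star = rng.standard_normal(n)     # ground-truth signal
A = rng.standard_normal((m, n))     # Gaussian sensing vectors a_j as rows
y = (A @ x_star) ** 2               # phaseless measurements y_j = (a_j^T x*)^2

# Spectral initialization: leading eigenvector of (1/m) sum_j y_j a_j a_j^T,
# rescaled to match the estimated signal energy.
Y = (A.T * y) @ A / m
_, eigvec = np.linalg.eigh(Y)
x = eigvec[:, -1] * np.sqrt(y.mean())

# Vanilla gradient descent on f(x) = (1/4m) sum_j ((a_j^T x)^2 - y_j)^2,
# with a constant step size chosen heuristically as 0.1 / ||x*||^2.
eta = 0.1 / y.mean()
for _ in range(500):
    Ax = A @ x
    grad = A.T @ ((Ax ** 2 - y) * Ax) / m
    x -= eta * grad

# Report the error up to the unavoidable global sign ambiguity.
dist = min(np.linalg.norm(x - x_star), np.linalg.norm(x + x_star))
print(f"relative error: {dist / np.linalg.norm(x_star):.2e}")
```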
Related Items (33)
- The numerics of phase retrieval
- Approximate message passing with spectral initialization for generalized linear models
- Near-optimal performance bounds for orthogonal and permutation group synchronization via spectral methods
- Asymmetry helps: eigenvalue and eigenvector analyses of asymmetrically perturbed low-rank matrices
- Nonconvex Low-Rank Tensor Completion from Noisy Data
- Improved Performance Guarantees for Orthogonal Group Synchronization via Generalized Power Method
- Robust recovery of low-rank matrices with non-orthogonal sparse decomposition from incomplete measurements
- GNMR: A Provable One-Line Algorithm for Low Rank Matrix Recovery
- Lower bounds for non-convex stochastic optimization
- Byzantine-robust distributed sparse learning for \(M\)-estimation
- Gradient descent with random initialization: fast global convergence for nonconvex phase retrieval
- Matrix completion with nonconvex regularization: spectral operators and scalable algorithms
- Entrywise eigenvector analysis of random matrices with low expected rank
- Near-optimal bounds for generalized orthogonal Procrustes problem via generalized power method
- First-Order Methods for Nonconvex Quadratic Minimization
- Inference for low-rank completion without sample splitting with application to treatment effect estimation
- Adaptive iterative hard thresholding for low-rank matrix recovery and rank-one measurements
- An Equivalence between Critical Points for Rank Constraints Versus Low-Rank Factorizations
- Noisy Matrix Completion: Understanding Statistical Guarantees for Convex Relaxation via Nonconvex Optimization
- Median-Truncated Gradient Descent: A Robust and Scalable Nonconvex Approach for Signal Estimation
- Exact Recovery of Multichannel Sparse Blind Deconvolution via Gradient Descent
- Smoothed amplitude flow-based phase retrieval algorithm
- Normal approximation and confidence region of singular subspaces
- Estimation from nonlinear observations via convex programming with application to bilinear regression
- Subspace estimation from unbalanced and incomplete data matrices: \({\ell_{2,\infty}}\) statistical guarantees
- Bridging convex and nonconvex optimization in robust PCA: noise, outliers and missing data
- Spectral method and regularized MLE are both optimal for top-\(K\) ranking
- Non-convex exact community recovery in stochastic block model
- Proof methods for robust low-rank matrix recovery
Uses Software
Cites Work
- Matrix Completion and Low-Rank SVD via Fast Alternating Least Squares
- Optimal rates of convergence for noisy sparse phase retrieval via thresholded Wirtinger flow
- Solving quadratic equations via phaselift when there are about as many equations as unknowns
- Oracle inequalities in empirical risk minimization and sparse recovery problems. École d'Été de Probabilités de Saint-Flour XXXVIII-2008.
- Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion
- An elementary proof of convex phase retrieval in the natural parameter space via the linear program PhaseMax
- A tail inequality for quadratic forms of subgaussian random vectors
- Hanson-Wright inequality and sub-Gaussian concentration
- Estimating the matrix \(p\)-norm
- Low rank matrix completion by alternating steepest descent methods
- The spectral norm of a nonnegative matrix
- Perturbation bounds for matrix square roots and Pythagorean sums
- Orthogonal Procrustes rotation for two or more matrices
- A note on \(\sin\Theta\) theorems for singular subspace variations
- Phase retrieval with one or two diffraction patterns by alternating projections with the null initialization
- A flexible convex relaxation for phase retrieval
- Debiasing the Lasso: optimal sample size for Gaussian designs
- The landscape of empirical risk for nonconvex losses
- Fast rank-one alternating minimization algorithm for phase retrieval
- The likelihood ratio test in high-dimensional logistic regression is asymptotically a rescaled Chi-square
- Spectral method and regularized MLE are both optimal for top-\(K\) ranking
- Rapid, robust, and reliable blind deconvolution via nonconvex optimization
- ROP: matrix recovery via rank-one projections
- The local convexity of solving systems of quadratic equations
- Fundamental limits of weak recovery with applications to phase retrieval
- Gradient descent with random initialization: fast global convergence for nonconvex phase retrieval
- Exact matrix completion via convex optimization
- Convex Recovery of a Structured Signal from Independent Random Linear Measurements
- Guarantees of Riemannian Optimization for Low Rank Matrix Recovery
- On robust regression with high-dimensional predictors
- Guaranteed Matrix Completion via Non-Convex Factorization
- Exact and Stable Covariance Estimation From Quadratic Sampling via Convex Programming
- Incoherence-Optimal Matrix Completion
- Phase Retrieval via Wirtinger Flow: Theory and Algorithms
- Solving Random Quadratic Systems of Equations Is Nearly as Easy as Solving Linear Systems
- Blind Deconvolution Using Convex Programming
- Blind Recovery of Sparse Signals From Subsampled Convolution
- Complete Dictionary Recovery Over the Sphere I: Overview and the Geometric Picture
- Robust principal component analysis?
- Rank-Sparsity Incoherence for Matrix Decomposition
- Self-calibration and biconvex compressive sensing
- Solving systems of phaseless equations via Kaczmarz methods: a proof of concept study
- Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization
- Non-Convex Phase Retrieval From STFT Measurements
- Solving Systems of Random Quadratic Equations via Truncated Amplitude Flow
- PhaseMax: Convex Phase Retrieval via Basis Pursuit
- GESPAR: Efficient Phase Retrieval of Sparse Signals
- Compressive Phase Retrieval via Generalized Approximate Message Passing
- Phase Retrieval Using Alternating Minimization
- The Projected Power Method: An Efficient Algorithm for Joint Alignment from Pairwise Differences
- Theoretical Insights Into the Optimization Landscape of Over-Parameterized Shallow Neural Networks
- Sparse Phase Retrieval via Truncated Amplitude Flow
- A Well-Tempered Landscape for Non-convex Robust Subspace Recovery
- Near-Optimal Bounds for Phase Synchronization
- Fast and Guaranteed Blind Multichannel Deconvolution Under a Bilinear System Model
- 10.1162/153244302760200704
- Inference and uncertainty quantification for noisy matrix completion
- Structured Signal Recovery From Quadratic Measurements: Breaking Sample Complexity Barriers via Nonconvex Optimization
- Symmetry, Saddle Points, and Global Optimization Landscape of Nonconvex Matrix Factorization
- Blind Deconvolution by a Steepest Descent Algorithm on a Quotient Manifold
- Nonconvex Optimization Meets Low-Rank Matrix Factorization: An Overview
- Fast, robust and non-convex subspace recovery
- Regularized gradient descent: a non-convex recipe for fast joint blind deconvolution and demixing
- Phase retrieval via randomized Kaczmarz: theoretical guarantees
- A useful variant of the Davis–Kahan theorem for statisticians
- A Probabilistic and RIPless Theory of Compressed Sensing
- Recovering Low-Rank Matrices From Few Coefficients in Any Basis
- ADMiRA: Atomic Decomposition for Minimum Rank Approximation
- Matrix Completion From a Few Entries
- The Power of Convex Relaxation: Near-Optimal Matrix Completion
- Perturbation Bounds for the Polar Decomposition
- A Simpler Approach to Matrix Completion
- Restricted strong convexity and weighted matrix completion: Optimal bounds with noise
- Low-rank matrix completion using alternating minimization
- An Introduction to Matrix Concentration Inequalities
- The Rotation of Eigenvectors by a Perturbation. III
- Perturbation bounds in connection with singular value decomposition
- Concentration and Moment Inequalities for Polynomials of Independent Random Variables
- Phase Retrieval via Matrix Completion