The Dynamics of Message Passing on Dense Graphs, with Applications to Compressed Sensing
From MaRDI portal
Publication:5281076
Abstract: Approximate message passing algorithms have proved extremely effective in reconstructing sparse signals from a small number of incoherent linear measurements. Extensive numerical experiments further showed that their dynamics are accurately tracked by a simple one-dimensional iteration termed state evolution. In this paper we provide the first rigorous foundation for state evolution. We prove that it indeed holds asymptotically in the large system limit for sensing matrices with independent and identically distributed Gaussian entries. While our focus is on message passing algorithms for compressed sensing, the analysis extends beyond this setting to a general class of algorithms on dense graphs. In this context, state evolution plays the role that density evolution plays for sparse graphs. The proof technique is fundamentally different from the standard approach to density evolution, which cannot cope with the large number of short loops in the underlying factor graph. It relies instead on a conditioning technique recently developed by Erwin Bolthausen in the context of spin glass theory.
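The abstract's claim can be illustrated numerically: run the AMP iteration on an i.i.d. Gaussian sensing matrix and compare its empirical noise level against the scalar state-evolution recursion. The sketch below is a minimal illustration, not the paper's code; the three-point sparse prior, the soft-threshold denoiser, and all parameter choices (e.g. the threshold multiplier `alpha = 1.5`) are assumptions made for the demo.

```python
import numpy as np

def soft_threshold(x, t):
    """Elementwise soft-thresholding denoiser eta(x; t) = sign(x) * max(|x| - t, 0)."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def amp(y, A, alpha, n_iter=25):
    """AMP for y = A x0 + w with the soft-threshold denoiser at level alpha * tau_t.

    Returns the estimate and the empirical noise levels tau_t = ||z^t|| / sqrt(n);
    state evolution predicts the effective observation x^t + A^T z^t behaves like
    x0 plus tau_t * N(0, I) noise.
    """
    n, N = A.shape
    x = np.zeros(N)
    z = y.copy()
    taus = []
    for _ in range(n_iter):
        tau = np.linalg.norm(z) / np.sqrt(n)
        taus.append(tau)
        r = x + A.T @ z                        # effective observation
        x = soft_threshold(r, alpha * tau)
        # Onsager correction: (1/delta) * z^t * <eta'>, with <eta'> = nnz/N, delta = n/N
        z = y - A @ x + (np.count_nonzero(x) / n) * z
    return x, taus

def state_evolution(tau0, delta, sigma2, alpha, eps, n_iter=25, n_mc=200_000, seed=1):
    """Scalar recursion tau_{t+1}^2 = sigma^2 + E[(eta(X0 + tau_t Z; alpha tau_t) - X0)^2] / delta,
    evaluated by Monte Carlo for the three-point prior X0 in {-1, 0, +1}."""
    rng = np.random.default_rng(seed)
    x0 = rng.choice([-1.0, 0.0, 1.0], size=n_mc, p=[eps / 2, 1 - eps, eps / 2])
    g = rng.standard_normal(n_mc)
    tau, taus = tau0, []
    for _ in range(n_iter):
        taus.append(tau)
        mse = np.mean((soft_threshold(x0 + tau * g, alpha * tau) - x0) ** 2)
        tau = np.sqrt(sigma2 + mse / delta)
    return taus

# Demo: n = 1000 Gaussian measurements of an N = 2000 dimensional, 10%-sparse signal.
rng = np.random.default_rng(0)
n, N, eps, sigma, alpha = 1000, 2000, 0.1, 0.1, 1.5
x0 = rng.choice([-1.0, 0.0, 1.0], size=N, p=[eps / 2, 1 - eps, eps / 2])
A = rng.standard_normal((n, N)) / np.sqrt(n)   # i.i.d. N(0, 1/n) sensing matrix
y = A @ x0 + sigma * rng.standard_normal(n)

x_hat, taus_amp = amp(y, A, alpha)
taus_se = state_evolution(taus_amp[0], n / N, sigma**2, alpha, eps)
```

At these dimensions the empirical `taus_amp` trajectory should track the state-evolution prediction `taus_se` closely at every iteration, which is exactly the phenomenon the paper proves in the large-system limit.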
Cited in
(only showing first 100 items)
- Accelerating cross-validation in multinomial logistic regression with \(\ell_1\)-regularization
- Data assimilation -- mathematical foundation and applications. Abstracts from the workshop held February 20--26, 2022
- Semi-analytic resampling in Lasso
- Finding hidden cliques of size \(\sqrt{N/e}\) in nearly linear time
- Generalized approximate survey propagation for high-dimensional estimation
- Critical behavior and universality classes for an algorithmic phase transition in sparse reconstruction
- Notes on computational-to-statistical gaps: predictions using statistical physics
- scientific article (zbMATH DE number 7750674; no title available)
- Approximate message passing with spectral initialization for generalized linear models
- Constrained low-rank matrix estimation: phase transitions, approximate message passing and applications
- Asymptotic mutual information for the balanced binary stochastic block model
- LASSO risk and phase transition under dependence
- Unbiasing in iterative reconstruction algorithms for discrete compressed sensing
- Statistical limits of spiked tensor models
- Matrix inference and estimation in multi-layer models
- Overcoming the limitations of phase transition by higher order analysis of regularization techniques
- A power analysis for Model-X knockoffs with \(\ell_p\)-regularized statistics
- Robust subspace clustering
- Algorithmic obstructions in the random number partitioning problem
- Asymptotic risk and phase transition of \(l_1\)-penalized robust estimator
- Community detection and stochastic block models: recent developments
- Phase transition in the spiked random tensor with Rademacher prior
- Optimal low-degree hardness of maximum independent set
- Perturbative construction of mean-field equations in extensive-rank matrix factorization and denoising
- A Unifying Tutorial on Approximate Message Passing
- Local algorithms for maximum cut and minimum bisection on locally treelike regular graphs of large degree
- Statistical mechanics approach to 1-bit compressed sensing
- Approximate message passing algorithms for rotationally invariant matrices
- Semi-analytic approximate stability selection for correlated data in generalized linear models
- Plug-in estimation in high-dimensional linear inverse problems: a rigorous analysis
- Debiasing the Lasso: optimal sample size for Gaussian designs
- The likelihood ratio test in high-dimensional logistic regression is asymptotically a rescaled Chi-square
- Regularization by denoising via fixed-point projection (RED-PRO)
- Infinite-width limit of deep linear neural networks
- Analysis of Bayesian inference algorithms by the dynamical functional approach
- Fundamental limits of weak recovery with applications to phase retrieval
- High dimensional robust M-estimation: asymptotic variance via approximate message passing
- Approximate message-passing with spatially coupled structured operators, with applications to compressed sensing and sparse superposition codes
- Statistical mechanics analysis of generalized multi-dimensional knapsack problems
- A message-passing approach to phase retrieval of sparse signals
- Thouless-Anderson-Palmer equations for the multi-species Sherrington-Kirkpatrick model
- Characterizing the SLOPE trade-off: a variational perspective and the Donoho-Tanner limit
- Optimization of the Sherrington--Kirkpatrick Hamiltonian
- Recovering structured signals in noise: least-squares meets compressed sensing
- Computational barriers to estimation from low-degree polynomials
- Fluctuations, bias, variance and ensemble of learners: exact asymptotics for convex losses in high-dimension
- Multi-layer state evolution under random convolutional design
- The replica-symmetric free energy for Ising spin glasses with orthogonally invariant couplings
- Which bridge estimator is the best for variable selection?
- Universality of regularized regression estimators in high dimensions
- Noisy linear inverse problems under convex constraints: exact risk asymptotics in high dimensions
- Automatic bias correction for testing in high‐dimensional linear models
- Fast and reliable parameter estimation from nonlinear observations
- Universality of approximate message passing algorithms and tensor networks
- The existence of maximum likelihood estimate in high-dimensional binary response generalized linear models
- Universality of approximate message passing algorithms
- Learning low-dimensional nonlinear structures from high-dimensional noisy data: an integral operator approach
- Perfect reconstruction of sparse signals with piecewise continuous nonconvex penalties and nonconvexity control
- Message-Passing De-Quantization With Applications to Compressed Sensing
- Fundamental limits of low-rank matrix estimation with diverging aspect ratios
- Weighted message passing and minimum energy flow for heterogeneous stochastic block models with side information
- Sharp MSE bounds for proximal denoising
- Local convexity of the TAP free energy and AMP convergence for \(\mathbb{Z}_2\)-synchronization
- The phase transition of matrix recovery from Gaussian measurements matches the minimax MSE of matrix denoising
- Statistical mechanics of low-rank tensor decomposition
- Phase transition in random tensors with multiple independent spikes
- The scaling limit of high-dimensional online independent component analysis
- Approximate message passing with rigorous guarantees for pooled data and quantitative group testing
- The distribution of the Lasso: uniform control over sparse balls and adaptive parameter tuning
- The overlap gap property and approximate message passing algorithms for \(p\)-spin models
- A dynamical mean-field theory for learning in restricted Boltzmann machines
- Optimization of mean-field spin glasses
- A tradeoff between false discovery and true positive proportions for sparse high-dimensional logistic regression
- On the TAP equations via the cavity approach in the generic mixed \(p\)-spin models
- An iterative construction of solutions of the TAP equations for the Sherrington-Kirkpatrick model
- Tight Lipschitz hardness for optimizing mean field spin glasses
- Equilibria of large random Lotka-Volterra systems with vanishing species: a mathematical approach
- Algorithmic pure states for the negative spherical perceptron
- On the universality of noiseless linear estimation with respect to the measurement matrix
- Bayesian Imaging Using Plug & Play Priors: When Langevin Meets Tweedie
- Large scale stochastic dynamics. Abstracts from the workshop held September 11--17, 2022
- Submatrix localization via message passing
- Analysis of random sequential message passing algorithms for approximate inference
- The committee machine: computational to statistical gaps in learning a two-layers neural network
- TAP free energy, spin glasses and variational inference
- Disordered systems insights on computational hardness
- Dense limit of the Dawid–Skene model for crowdsourcing and regions of sub-optimality of message passing algorithms
- scientific article (zbMATH DE number 7625169; no title available)
- Detangling robustness in high dimensions: composite versus model-averaged estimation
- Approximate survey propagation for statistical inference
- Blind sensor calibration using approximate message passing
- Compressive Computed Tomography Reconstruction through Denoising Approximate Message Passing
- Replica analysis of overfitting in generalized linear regression models
- Consistent parameter estimation for Lasso and approximate message passing
- Optimal combination of linear and spectral estimators for generalized linear models
- Estimation of low-rank matrices via approximate message passing
- Approximate message passing for sparse matrices with application to the equilibria of large ecological Lotka-Volterra systems
- Optimization algorithms for multi-species spherical spin glasses
- Approximate message passing for nonconvex sparse regularization with stability and asymptotic analysis
- Optimality and sub-optimality of PCA. I: Spiked random matrix models