Distributed continuous-time accelerated neurodynamic approaches for sparse recovery via smooth approximation to \(L_1\)-minimization
From MaRDI portal
Publication:6535872
Recommendations
- Neurodynamic approaches for sparse recovery problem with linear inequality constraints
- Smoothing inertial neurodynamic approach for sparse signal reconstruction via \(L_p\)-norm minimization
- A neurodynamic algorithm for sparse signal reconstruction with finite-time convergence
- Fixed-Time Stable Neurodynamic Flow to Sparse Signal Recovery via Nonconvex \(L_1\)-\(\beta_2\)-Norm
- Generalized sparse recovery model and its neural dynamical optimization method for compressed sensing
Cites work
- scientific article; zbMATH DE number 3850830 (no title available)
- Accelerated Distributed Nesterov Gradient Descent
- Adaptive Exact Penalty Design for Constrained Distributed Optimization
- Atomic decomposition by basis pursuit
- Decoding by Linear Programming
- Distributed Continuous-Time Algorithm for Constrained Convex Optimizations via Nonsmooth Analysis Approach
- Distributed gradient algorithm for constrained optimization with application to load sharing in power systems
- Dynamical Primal-Dual Nesterov Accelerated Method and Its Application to Network Optimization
- Exponential stability of nonlinear state-dependent delayed impulsive systems with applications
- Fast Distributed Gradient Methods
- Fast and Accurate Algorithms for Re-Weighted \(\ell_1\)-Norm Minimization
- Fast primal-dual algorithm via dynamical system for a linearly constrained convex optimization problem
- Fixed-Time Stable Neurodynamic Flow to Sparse Signal Recovery via Nonconvex \(L_1\)-\(\beta_2\)-Norm
- Input-to-state stability of impulsive reaction-diffusion neural networks with infinite distributed delays
- Iteratively reweighted least squares minimization for sparse recovery
- NESTA: A fast and accurate first-order method for sparse recovery
- Novel projection neurodynamic approaches for constrained convex optimization
- Primal-dual algorithm for distributed constrained optimization
- Projected Nesterov's Proximal-Gradient Algorithm for Sparse Signal Recovery
- Signal Recovery From Random Measurements Via Orthogonal Matching Pursuit
- Smoothing methods for nonsmooth, nonconvex minimization
- Sparse signal reconstruction via recurrent neural networks with hyperbolic tangent function
- Sparse signal recovery by accelerated \(\ell_q\) \((0<q<1)\) thresholding algorithm
Cited in: 2 publications