Sparse signal reconstruction via recurrent neural networks with hyperbolic tangent function
DOI: 10.1016/j.neunet.2022.05.022 · zbMath: 1522.94017 · OpenAlex: W4281728993 · Wikidata: Q113868235 · Scholia: Q113868235 · MaRDI QID: Q6077029
Hongsong Wen, Xing He, Tingwen Huang
Publication date: 17 October 2023
Published in: Neural Networks
Full work available at URL: https://doi.org/10.1016/j.neunet.2022.05.022
Keywords: hyperbolic tangent function; finite-time convergence; sliding mode control technique; \(L_1\)-minimization; finite-time RNN (FTRNN)
MSC classification: Artificial neural networks and deep learning (68T07); Signal theory (characterization, reconstruction, filtering, etc.) (94A12); Image processing (compression, reconstruction, etc.) in information and communication theory (94A08)
Related Items (3)
Cites Work
- The restricted isometry property and its implications for compressed sensing
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\) (with discussions and rejoinder)
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- From Sparse Solutions of Systems of Equations to Sparse Modeling of Signals and Images
- A Novel Accelerometer-Based Gesture Recognition System
- Dynamical Sparse Recovery With Finite-Time Convergence
- An Interior Point Algorithm for Large-Scale Nonlinear Programming
- Analysis of Sparse Representation and Blind Source Separation
- Sparse Approximate Solutions to Linear Systems
- A sparse signal reconstruction perspective for source localization with sensor arrays
- An Augmented Lagrangian Approach to the Constrained Optimization Formulation of Imaging Inverse Problems
- A neurodynamic algorithm for sparse signal reconstruction with finite-time convergence