A Common Network Architecture Efficiently Implements a Variety of Sparsity-Based Inference Problems
From MaRDI portal
Publication:2840897
DOI: 10.1162/NECO_a_00372 · zbMath: 1268.92001 · OpenAlex: W2107306109 · Wikidata: Q40062426 · Scholia: Q40062426 · MaRDI QID: Q2840897
Pierre Garrigues, Christopher J. Rozell, Adam S. Charles
Publication date: 23 July 2013
Published in: Neural Computation
Full work available at URL: https://doi.org/10.1162/neco_a_00372
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- Enhancing sparsity by reweighted \(\ell _{1}\) minimization
- Internal representations for associative memory
- Robust regression: Asymptotics, conjectures and Monte Carlo
- Coordinate and subspace optimization methods for linear least squares with non-quadratic regularization
- Wavelet-based image estimation: an empirical Bayes approach using Jeffrey's noninformative prior
- Regularization of Wavelet Approximations
- Local Strong Homogeneity of a Regularized Estimator
- Block-Sparse Signals: Uncertainty Relations and Efficient Recovery
- Neural networks and physical systems with emergent collective computational abilities.