PDCO
From MaRDI portal
Software: 17288
swMATH: 5148
MaRDI QID: Q17288
FDO: Q17288
Author name not available
Cited In (showing first 100 items)
- An efficient descent method for locally Lipschitz multiobjective optimization problems
- Spectral dynamics and regularization of incompletely and irregularly measured data
- Globally sparse and locally dense signal recovery for compressed sensing
- Extensions of Gauss quadrature via linear programming
- Generalized conjugate gradient methods for \(\ell_1\) regularized convex quadratic programming with finite convergence
- New methods for fitting multiple sinusoids from irregularly sampled data
- Robust sparse recovery via a novel convex model
- Nonparametric denoising of signals of unknown local structure. II: Nonparametric function recovery
- Processing MUSE hyperspectral data: denoising, deconvolution and detection of astrophysical sources
- On the uniqueness of solutions for the basis pursuit in the continuum
- Regularization of geophysical ill-posed problems by iteratively re-weighted and refined least squares
- A short note on compressed sensing with partially known signal support
- Estimation of block sparsity in compressive sensing
- New augmented Lagrangian-based proximal point algorithm for convex optimization with equality constraints
- Enhancing sparsity of Hermite polynomial expansions by iterative rotations
- Positive shrinkage, improved pretest and absolute penalty estimators in partially linear models
- A consistent algorithm to solve Lasso, elastic-net and Tikhonov regularization
- Analysis of basis pursuit via capacity sets
- Compressive sampling and rapid reconstruction of broadband frequency hopping signals with interference
- New cyclic sparsity measures for deconvolution based on convex relaxation
- Sparse recovery via differential inclusions
- On the identifiability of overcomplete dictionaries via the minimisation principle underlying K-SVD
- Limited-memory LDL\(^{\top}\) factorization of symmetric quasi-definite matrices with application to constrained optimization
- Note on the modified relaxation CQ algorithm for the split feasibility problem
- A new convergence analysis and perturbation resilience of some accelerated proximal forward-backward algorithms with errors
- Compression and denoising using \(l _{0}\)-norm
- An efficient privacy-preserving compressive data gathering scheme in WSNs
- Local identifiability of \(\ell_1\)-minimization dictionary learning: a sufficient and almost necessary condition
- Preserving injectivity under subgaussian mappings and its application to compressed sensing
- The \(\ell_{2,q}\) regularized group sparse optimization: lower bound theory, recovery bound and algorithms
- Theoretical guarantees for graph sparse coding
- Generalized ADMM with optimal indefinite proximal term for linearly constrained convex optimization
- Basis adaptive sample efficient polynomial chaos (BASE-PC)
- A proximal alternating linearization method for minimizing the sum of two convex functions
- Feature selection for high-dimensional data
- Minimization of \(L_1\) over \(L_2\) for sparse signal recovery with convergence guarantee
- Fast thresholding algorithms with feedbacks for sparse signal recovery
- Sampling from non-smooth distributions through Langevin diffusion
- Signal recovery under mutual incoherence property and oracle inequalities
- Beyond sparsity: the role of \(L_{1}\)-optimizer in pattern classification
- A sparse recovery method for DOA estimation based on the sample covariance vectors
- Sparse signal reconstruction via the approximations of \(\ell_0\) quasinorm
- Gradient-based method with active set strategy for \(\ell _1\) optimization
- Penalized regression combining the \( L_{1}\) norm and a correlation based penalty
- A simple Gaussian measurement bound for exact recovery of block-sparse signals
- Improved bounds for sparse recovery from subsampled random convolutions
- Bias-variance trade-off for prequential model list selection
- Sparse representation of signals in Hardy space
- Geometric separation in \(\mathbb{R}^3\)
- An iterative support shrinking algorithm for non-Lipschitz optimization in image restoration
- Estimation in high dimensions: a geometric perspective
- Analysis of convergence for the alternating direction method applied to joint sparse recovery
- A pseudo-heuristic parameter selection rule for \(l^1\)-regularized minimization problems
- A dual algorithm for a class of augmented convex signal recovery models
- Block orthogonal greedy algorithm for stable recovery of block-sparse signal representations
- Critical behavior and universality classes for an algorithmic phase transition in sparse reconstruction
- Beyond sparsity: recovering structured representations by \({\ell}^1\) minimization and greedy algorithms
- Spectral Compressed Sensing via Projected Gradient Descent
- Parsimonious additive models
- PROMP: a sparse recovery approach to lattice-valued signals
- Surveying and comparing simultaneous sparse approximation (or group-lasso) algorithms
- Two-dimensional random projection
- Discussion: ``A significance test for the lasso''
- Discussion: ``A significance test for the lasso''
- Discussion: ``A significance test for the lasso''
- Discussion: ``A significance test for the lasso''
- Learning semidefinite regularizers
- On the efficiency of the orthogonal matching pursuit in compressed sensing
- Geometric separation by single-pass alternating thresholding
- Necessary and sufficient conditions of solution uniqueness in 1-norm minimization
- A Bayesian lasso via reversible-jump MCMC
- Nonconvex sorted \(\ell_1\) minimization for sparse approximation
- A proximal strictly contractive Peaceman-Rachford splitting method for convex programming with applications to imaging
- A least-squares method for sparse low rank approximation of multivariate functions
- Incrementally updated gradient methods for constrained and regularized optimization
- An adaptive generalized multiscale discontinuous Galerkin method for high-contrast flow problems
- Compressed sensing for finite-valued signals
- An adaptive primal-dual framework for nonsmooth convex minimization
- Sparse identification of nonlinear dynamical systems via reweighted \(\ell_1\)-regularized least squares
- Source localization using a sparse representation framework to achieve superresolution
- An adaptive gradient projection algorithm for piecewise convex optimization and its application in compressed spectrum sensing
- Proximal methods for the latent group lasso penalty
- Discussion: Time-threshold maps: using information from wavelet reconstructions with all threshold values simultaneously
- Rejoinder: Time-threshold maps: using information from wavelet reconstructions with all threshold values simultaneously
- 2DPCA with L1-norm for simultaneously robust and sparse modelling
- An infeasible-point subgradient method using adaptive approximate projections
- Sparsity and persistence: mixed norms provide simple signal models with dependent coefficients
- Fast \(\ell _{1}\) minimization by iterative thresholding for multidimensional NMR spectroscopy
- First-order optimality condition of basis pursuit denoise problem
- Projected Landweber iteration for matrix completion
- Projected shrinkage algorithm for box-constrained \(\ell _1\)-minimization
- Robust face recognition via block sparse Bayesian learning
- Sparse time-frequency representation of nonlinear and nonstationary data
- A sharp nonasymptotic bound and phase diagram of \(L_{1/2}\) regularization
- Deterministic construction of sparse binary matrices via incremental integer optimization
- Adaptive algorithms for sparse system identification
- A variable fixing version of the two-block nonlinear constrained Gauss-Seidel algorithm for \(\ell_1\)-regularized least-squares
- Sparse dual frames and dual Gabor functions of minimal time and frequency supports
- Fast global convergence of gradient methods for high-dimensional statistical recovery
- High-dimensional inference in misspecified linear models
This page was built for software: PDCO