Sparse and smooth signal estimation: convexification of \(\ell_0\)-formulations
From MaRDI portal
Publication: Q4998944
Authors: Alper Atamtürk, Andrés Gómez, Shaoning Han
Publication date: 9 July 2021
Full work available at URL: https://arxiv.org/abs/1811.02655
Recommendations
- A smoothing method for sparse optimization over convex sets
- Complementarity formulations of \(\ell_0\)-norm optimization
- A convex relaxation framework consisting of a primal-dual alternative algorithm for solving \(\ell_0\) sparsity-induced optimization problems with application to signal recovery based image restoration
- Relaxation approaches for nonlinear sparse optimization problems
- Sparse approximate reconstruction decomposed by two optimization problems
Cites Work
- The elements of statistical learning. Data mining, inference, and prediction
- Nearly unbiased variable selection under minimax concave penalty
- Exact spike train inference via \(\ell_{0}\) optimization
- Extended formulations in mixed integer conic quadratic programming
- Simplex QP-based methods for minimizing a conic quadratic objective over polyhedra
- SparseNet: coordinate descent with nonconvex penalties
- Nonlinear total variation based noise removal algorithms
- Fast best subset selection: coordinate descent and local combinatorial optimization algorithms
- The Adaptive Lasso and Its Oracle Properties
- Title not available
- Adaptive piecewise polynomial estimation via trend filtering
- Best subset selection via a modern optimization lens
- The DFS fused Lasso: linear-time denoising over general graphs
- Regression Shrinkage and Selection via The Lasso: A Retrospective
- Sparsity and Smoothness Via the Fused Lasso
- Regularization and Variable Selection Via the Elastic Net
- The solution path of the generalized lasso
- Enhancing sparsity by reweighted \(\ell _{1}\) minimization
- Properties and refinements of the fused Lasso
- Locally adaptive regression splines
- Adaptive Lasso for sparse high-dimensional regression models
- Title not available
- Identifying a minimal class of models for high-dimensional data
- Stable recovery of sparse overcomplete representations in the presence of noise
- A Statistical View of Some Chemometrics Regression Tools
- Title not available
- Consistency of the group Lasso and multiple kernel learning
- \(\ell_1\) Trend Filtering
- Compressed sensing
- Title not available
- Applications of second-order cone programming
- M-matrix characterizations. I: nonsingular M-matrices
- Second-order cone programming
- A polyhedral branch-and-cut approach to global optimization
- Perspective cuts for a class of convex 0-1 mixed integer programs
- Title not available
- A strong conic quadratic reformulation for machine-job assignment with controllable processing times
- On recurring theorems on diagonal dominance
- On factor width and symmetric \(H\)-matrices
- Atomic decomposition by basis pursuit
- On constrained and regularized high-dimensional regression
- Iterative Methods for Total Variation Denoising
- On mathematical programming with indicator constraints
- Quadratic convex reformulations for semicontinuous quadratic programming
- Conic mixed-integer rounding cuts
- Strong formulations of robust mixed 0-1 programming
- Approximation algorithms for classification problems with pairwise relationships, metric labeling and Markov random fields
- Mixed-integer nonlinear programs featuring ``on/off'' constraints
- Perspective reformulations of mixed integer nonlinear programs with indicator variables
- Structural properties of affine sparsity constraints
- Title not available
- Criteria for generalized diagonally dominant matrices and \(M\)-matrices
- Structured sparsity via alternating direction methods
- Interior-point methods for optimization
- High Dimensional Thresholded Regression and Shrinkage Effect
- Cuts for Conic Mixed-Integer Programming
- The smooth-Lasso and other \(\ell _{1}+\ell _{2}\)-penalized methods
- A cut-based algorithm for the nonlinear dual of the minimum cost network flow problem
- Sparse high-dimensional regression: exact scalable algorithms and phase transitions
- An efficient algorithm for image segmentation, Markov random fields and related problems
- Strong formulations for quadratic optimization with M-matrices and indicator variables
- Scalable algorithms for the sparse ridge regression
- OR forum: An algorithmic approach to linear regression
- Quadratic cone cutting surfaces for quadratic programs with on-off constraints
- Multi-label Markov random fields as an efficient and effective tool for image segmentation, total variations and regularization
- Title not available
- Rejoinder: ``Best subset, forward stepwise or Lasso? Analysis and recommendations based on extensive comparisons''
- Sparse learning via Boolean relaxations
- Submodularity in Conic Quadratic Mixed 0–1 Optimization
- On integer and MPCC representability of affine sparsity
- Minimization of Akaike's information criterion in linear regression analysis via mixed integer nonlinear program
- Adjacency-clustering and its application for yield prediction in integrated circuit manufacturing
Cited In (18)
- Group sparse structural smoothing recovery: model, statistical properties and algorithm
- Cardinality minimization, constraints, and regularization: a survey
- Grouped variable selection with discrete optimization: computational and statistical perspectives
- \(2 \times 2\)-convexifications for convex quadratic optimization with indicator variables
- A graph-based decomposition method for convex quadratic optimization with indicators
- An Improved Smoothed $\ell^0$ Approximation Algorithm for Sparse Representation
- Outlier detection in time series via mixed-integer conic quadratic optimization
- On the \(\ell_1\)-norm invariant convex \(k\)-sparse decomposition of signals
- A Unitarily Constrained Total Least Squares Problem in Signal Processing
- Constrained optimization of rank-one functions with indicator variables
- The equivalence of optimal perspective formulation and Shor's SDP for quadratic programs with indicator variables
- Supermodularity and valid inequalities for quadratic optimization with indicators
- On the convexification of constrained quadratic optimization problems with indicator variables
- BranchHull: convex bilinear inversion from the entrywise product of signals with known signs
- Ideal formulations for constrained convex optimization problems with indicator variables
- Linear-step solvability of some folded concave and singly-parametric sparse optimization problems
- Comparing solution paths of sparse quadratic minimization with a Stieltjes matrix
- On the convex hull of convex quadratic optimization problems with indicators