Composite difference-MAX programs for modern statistical estimation problems
DOI: 10.1137/18M117337X · zbMATH Open: 1407.62250 · arXiv: 1803.00205 · OpenAlex: W2963562721 · Wikidata: Q128749629 · Scholia: Q128749629 · MaRDI QID: Q4562249 · FDO: Q4562249
Authors: Ying Cui, Jong-Shi Pang, Bodhisattva Sen
Publication date: 19 December 2018
Published in: SIAM Journal on Optimization
Full work available at URL: https://arxiv.org/abs/1803.00205
Recommendations
- On the pervasiveness of difference-convexity in optimization and statistics
- A difference of convex optimization algorithm for piecewise linear regression
- MM for penalized estimation
- Difference-of-convex learning: directional stationarity, optimality, and sparsity
- Regularized \(M\)-estimators with nonconvexity: statistical and algorithmic theory for local optima
Keywords: nonconvex optimization; semismooth Newton method; nondifferentiable objective; continuous piecewise affine regression
MSC classes: General nonlinear regression (62J02); Nonconvex programming, global optimization (90C26); Nonsmooth analysis (49J52)
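The keywords point to the difference-of-convex structure behind the title: a continuous piecewise affine function can always be written as the difference of two max-affine (hence convex) functions, which is the "difference-MAX" model class studied in the paper. The following sketch is purely illustrative (the function name and the toy example are ours, not taken from the paper) and shows only the model evaluation, not the paper's semismooth Newton algorithm:

```python
def diff_max_affine(x, planes_plus, planes_minus):
    """Evaluate f(x) = max over (a, b) of (a*x + b) minus max over (c, d)
    of (c*x + d), i.e. a difference of two 1-D max-affine functions."""
    return (max(a * x + b for a, b in planes_plus)
            - max(c * x + d for c, d in planes_minus))

# Toy example: |x| - max(x, 0), a continuous piecewise affine function
# with a kink at 0, written as a difference of two max-affine pieces.
plus = [(1.0, 0.0), (-1.0, 0.0)]   # max(x, -x) = |x|
minus = [(1.0, 0.0), (0.0, 0.0)]   # max(x, 0)
print(diff_max_affine(-2.0, plus, minus))  # 2.0
print(diff_max_affine(3.0, plus, minus))   # 0.0
```

Both pieces are convex, while their difference is nonconvex and nondifferentiable at the kinks, which is why the paper's stationarity analysis and semismooth Newton machinery are needed.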
Cites Work
- SDPNAL+: a majorized semismooth Newton-CG augmented Lagrangian method for semidefinite programming with nonnegative constraints
- Nearly unbiased variable selection under minimax concave penalty
- Title not available
- Regression Quantiles
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- On the convergence properties of the EM algorithm
- Support-vector networks
- Variational Analysis
- Sparsity and Smoothness Via the Fused Lasso
- Convex analysis and monotone operator theory in Hilbert spaces
- Enhancing sparsity by reweighted \(\ell _{1}\) minimization
- MM optimization algorithms
- Title not available
- Title not available
- A Newton-CG augmented Lagrangian method for semidefinite programming
- Title not available
- A Quadratically Convergent Newton Method for Computing the Nearest Correlation Matrix
- Title not available
- Incremental majorization-minimization optimization with application to large-scale machine learning
- Proximal alternating linearized minimization for nonconvex and nonsmooth problems
- Finite-Dimensional Variational Inequalities and Complementarity Problems
- A nonsmooth version of Newton's method
- Semismooth and Semiconvex Functions in Constrained Optimization
- Convergence of descent methods for semi-algebraic and tame problems: proximal algorithms, forward-backward splitting, and regularized Gauss-Seidel methods
- Iterative Solution of Nonlinear Equations in Several Variables
- The DC (Difference of convex functions) programming and DCA revisited with DC models of real world nonconvex optimization problems
- Newton's Method for B-Differentiable Equations
- The Łojasiewicz Inequality for Nonsmooth Subanalytic Functions with Applications to Subgradient Dynamical Systems
- On the convergence of the proximal algorithm for nonsmooth functions involving analytic features
- A semismooth Newton method with multidimensional filter globalization for \(l_1\)-optimization
- DC approximation approaches for sparse optimization
- Minimization of \(\ell_{1-2}\) for compressed sensing
- On functions representable as a difference of convex functions
- On the structure of convex piecewise quadratic functions
- Structural properties of affine sparsity constraints
- Title not available
- A globally convergent Newton method for convex \(SC^ 1\) minimization problems
- Extension of Newton and quasi-Newton methods to systems of PC^1 equations
- Multivariate convex regression with adaptive partitioning
- Computing B-stationary points of nonsmooth DC programs
- A Computational Framework for Multivariate Convex Regression and Its Variants
- Nonlinear programming
- Majorization-minimization procedures and convergence of SQP methods for semi-algebraic and tame programs
- Convergence analysis of difference-of-convex algorithm with subanalytic data
- A highly efficient semismooth Newton augmented Lagrangian method for solving lasso problems
- Difference-of-convex learning: directional stationarity, optimality, and sparsity
- A proximal difference-of-convex algorithm with extrapolation
- Robust multicategory support vector machines using difference convex algorithm
- Automatic speech recognition. A deep learning approach
- Enhanced proximal DC algorithms with extrapolation for a class of structured nonsmooth DC minimization
- An algorithm for the estimation of a regression function by continuous piecewise linear functions
Cited In (33)
- A fast and effective algorithm for sparse linear regression with \(\ell_p\)-norm data fidelity and elastic net regularization
- Markov chain stochastic DCA and applications in deep learning with PDEs regularization
- On the pervasiveness of difference-convexity in optimization and statistics
- On the superiority of PGMs to PDCAs in nonsmooth nonconvex sparse regression
- Max-affine regression via first-order methods
- Spectrahedral Regression
- Solving Nonsmooth and Nonconvex Compound Stochastic Programs with Applications to Risk Measure Minimization
- Asymptotic Properties of Stationary Solutions of Coupled Nonconvex Nonsmooth Empirical Risk Minimization
- Nonconvex and nonsmooth approaches for affine chance-constrained stochastic programs
- Multicomposite nonconvex optimization for training deep neural networks
- Approximations of semicontinuous functions with applications to stochastic optimization and statistical estimation
- Nonconvex robust programming via value-function optimization
- A global two-stage algorithm for non-convex penalized high-dimensional linear regression problems
- An augmented Lagrangian method with constraint generation for shape-constrained convex regression problems
- Exact guarantees on the absence of spurious local minima for non-negative rank-1 robust principal component analysis
- High-order optimization methods for fully composite problems
- Hybrid Algorithms for Finding a D-Stationary Point of a Class of Structured Nonsmooth DC Minimization
- Title not available
- An inexact proximal majorization-minimization algorithm for remote sensing image stripe noise removal
- A study of piecewise linear-quadratic programs
- Estimation of Knots in Linear Spline Models
- Optimality conditions for locally Lipschitz optimization with \(l_0\)-regularization
- Frank-Wolfe-type methods for a class of nonconvex inequality-constrained problems
- Proximal distance algorithms: theory and practice
- Penalty and augmented Lagrangian methods for constrained DC programming
- Stochastic difference-of-convex-functions algorithms for nonconvex programming
- Manifold sampling for optimizing nonsmooth nonconvex compositions
- Estimation of individualized decision rules based on an optimized covariate-dependent equivalent of random outcomes
- Consistent approximations in composite optimization
- Stability and error analysis for optimization and generalized equations
- Strong metric (sub)regularity of Karush-Kuhn-Tucker mappings for piecewise linear-quadratic convex-composite optimization and the quadratic convergence of Newton's method
- An efficient semismooth Newton method for adaptive sparse signal recovery problems
- A study of convex convex-composite functions via infimal convolution with applications
Uses Software