An efficient primal dual prox method for non-smooth optimization
DOI: 10.1007/s10994-014-5436-1 · zbMATH Open: 1311.90188 · arXiv: 1201.5283 · OpenAlex: W2034410976 · MaRDI QID: Q2339936 · FDO: Q2339936
Authors: Tianbao Yang, Mehrdad Mahdavi, Rong Jin, Shenghuo Zhu
Publication date: 14 April 2015
Published in: Machine Learning
Full work available at URL: https://arxiv.org/abs/1201.5283
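For orientation only, below is a minimal sketch of the generic first-order primal-dual prox iteration that this literature centers on, in the style of the cited Chambolle-Pock algorithm ("A first-order primal-dual algorithm for convex problems with applications to imaging"), applied to a fully non-smooth problem. The problem instance, function names, step sizes, and iteration count are illustrative assumptions; this is not the authors' specific algorithm.

```python
# Sketch of a first-order primal-dual prox iteration (Chambolle-Pock style)
# for the fully non-smooth problem
#     min_x  ||A x - b||_1 + lam * ||x||_1,
# written as the saddle-point problem
#     min_x  max_{||y||_inf <= 1}  <y, A x - b> + lam * ||x||_1.
# Instance, step sizes, and iteration count are illustrative assumptions.
import numpy as np

def primal_dual_prox(A, b, lam=0.1, n_iters=500):
    m, n = A.shape
    L = np.linalg.norm(A, 2)      # spectral norm of A
    tau = sigma = 1.0 / L         # step sizes satisfying tau * sigma * L**2 <= 1
    x = np.zeros(n)
    x_bar = x.copy()
    y = np.zeros(m)
    for _ in range(n_iters):
        # dual prox step: projection onto the l_inf unit ball
        y = np.clip(y + sigma * (A @ x_bar - b), -1.0, 1.0)
        # primal prox step: soft-thresholding, the prox of lam * ||.||_1
        x_new = x - tau * (A.T @ y)
        x_new = np.sign(x_new) * np.maximum(np.abs(x_new) - tau * lam, 0.0)
        # extrapolation of the primal iterate
        x_bar = 2.0 * x_new - x
        x = x_new
    return x

# tiny usage example on random data
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true
print(np.round(primal_dual_prox(A, b, lam=0.05), 2))
```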
Recommendations
- Perturbed proximal primal-dual algorithm for nonconvex nonsmooth optimization
- Title not available (zbMATH DE number 7404502)
- Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization
- A smooth primal-dual optimization framework for nonsmooth composite convex minimization
- Nonsmoothness in machine learning: specific structure, proximal identification, and applications
Mathematics Subject Classification
- Learning and adaptive systems in artificial intelligence (68T05)
- Nonsmooth analysis (49J52)
- Derivative-free methods and methods using generalized derivatives (90C56)
Cites Work
- A Fast Iterative Shrinkage-Thresholding Algorithm for Linear Inverse Problems
- The elements of statistical learning. Data mining, inference, and prediction
- Pegasos: primal estimated sub-gradient solver for SVM
- Title not available
- Distributed optimization and statistical learning via the alternating direction method of multipliers
- Quantile regression
- Splitting Algorithms for the Sum of Two Nonlinear Operators
- Model Selection and Estimation in Regression with Grouped Variables
- Smooth minimization of non-smooth functions
- Exact matrix completion via convex optimization
- Guaranteed minimum-rank solutions of linear matrix equations via nuclear norm minimization
- Title not available
- Prox-Method with Rate of Convergence O(1/t) for Variational Inequalities with Lipschitz Continuous Monotone Operators and Smooth Convex-Concave Saddle Point Problems
- Excessive Gap Technique in Nonsmooth Convex Minimization
- A feature selection Newton method for support vector machine classification
- A first-order primal-dual algorithm for convex problems with applications to imaging
- Optimization with sparsity-inducing penalties
- A general framework for a class of first order primal-dual algorithms for convex optimization in imaging science
- Monotone Operators and the Proximal Point Algorithm
- Convergence analysis of primal-dual algorithms for a saddle-point problem: from contraction perspective
- Title not available
- An optimal method for stochastic composite optimization
- Primal-dual splitting algorithm for solving inclusions with mixtures of composite, Lipschitzian, and parallel-sum type monotone operators
- Primal-dual first-order methods with \({\mathcal {O}(1/\varepsilon)}\) iteration-complexity for cone programming
- Efficient online and batch learning using forward backward splitting
- Convex multi-task feature learning
- ESTIMATING THE APPROXIMATION ERROR IN LEARNING THEORY
- Classification with a reject option using a hinge loss
- Are Loss Functions All the Same?
- SVM Soft Margin Classifiers: Linear Programming versus Quadratic Programming
- Title not available
- A forward–backward splitting algorithm for the minimization of non-smooth convex functionals in Banach space
Cited In (13)
- Online composite optimization with time-varying regularizers
- The nearest polynomial to multiple given polynomials with a given zero: a unified optimization approach
- Efficient computation of the nearest polynomial by linearized alternating direction method
- Running Primal-Dual Gradient Method for Time-Varying Nonconvex Problems
- A Smooth Double Proximal Primal-Dual Algorithm for a Class of Distributed Nonsmooth Optimization Problems
- Point process estimation with Mirror Prox algorithms
- Approximately nearest neighborhood image search using unsupervised hashing via homogeneous kernels
- Title not available
- An asynchronous subgradient-proximal method for solving additive convex optimization problems
- Non-stationary First-Order Primal-Dual Algorithms with Faster Convergence Rates
- RSG: Beating Subgradient Method without Smoothness and Strong Convexity
- Dualize, split, randomize: toward fast nonsmooth optimization algorithms
- Accelerate stochastic subgradient method by leveraging local growth condition