A two-step fixed-point proximity algorithm for a class of non-differentiable optimization models in machine learning
DOI: 10.1007/s10915-019-01045-7 · zbMATH Open: 1423.68382 · OpenAlex: W2972293876 · Wikidata: Q127308543 · MaRDI QID: Q2333729
Authors: Zheng Li, Guohui Song, Yuesheng Xu
Publication date: 13 November 2019
Published in: Journal of Scientific Computing
Full work available at URL: https://doi.org/10.1007/s10915-019-01045-7
Recommendations
- A fixed-point proximity approach to solving the support vector regression with the group Lasso regularization
- An efficient primal dual prox method for non-smooth optimization
- On the proximal Landweber Newton method for a class of nonsmooth convex problems
- Nonsmoothness in machine learning: specific structure, proximal identification, and applications
- Two-step fixed-point proximity algorithms for multi-block separable convex problems
Keywords: machine learning; sparse learning; non-differentiable optimization; proximity algorithms; two-step iteration methods
Cites Work
- Title not available
- Distributed optimization and statistical learning via the alternating direction method of multipliers
- Support-vector networks
- Model Selection and Estimation in Regression with Grouped Variables
- The Group Lasso for Logistic Regression
- Title not available
- Title not available
- A feature selection Newton method for support vector machine classification
- Proximity algorithms for the L1/TV image denoising model
- Optimization with sparsity-inducing penalties
- Proximity algorithms for image models: denoising
- Reproducing kernel Banach spaces for machine learning
- Universal kernels
- Multi-step fixed-point proximity algorithms for solving a class of optimization problems arising from image processing
- A proximity algorithm accelerated by Gauss-Seidel iterations for L1/TV denoising models
- A comparison of optimization methods and software for large-scale L1-regularized linear classification
- Title not available
- Reproducing kernel Banach spaces with the \(\ell^1\) norm
- Reproducing kernel Banach spaces with the \(\ell^1\) norm. II: Error analysis for regularized least square regression
- Generalized Mercer kernels and reproducing kernel Banach spaces
- Two-step fixed-point proximity algorithms for multi-block separable convex problems
- Sparse support vector machines in reproducing kernel Banach spaces
- Functional reproducing kernel Hilbert spaces for non-point-evaluation functional data
- Multi-task learning in vector-valued reproducing kernel Banach spaces with the \(\ell^1\) norm
- A fixed-point proximity approach to solving the support vector regression with the group Lasso regularization
Cited In (7)
- Title not available
- A fixed-point proximity approach to solving the support vector regression with the group Lasso regularization
- A Douglas-Rachford method for sparse extreme learning machine
- A duality approach to regularized learning problems in Banach spaces
- Inexact fixed-point proximity algorithm for the \(\ell_0\) sparse regularization problem
- Convergence rate analysis for fixed-point iterations of generalized averaged nonexpansive operators
- Parameter choices for sparse regularization with the \(\ell^1\) norm