A two-step fixed-point proximity algorithm for a class of non-differentiable optimization models in machine learning
From MaRDI portal
Publication: Q2333729
Recommendations
- A fixed-point proximity approach to solving the support vector regression with the group Lasso regularization
- An efficient primal dual prox method for non-smooth optimization
- On the proximal Landweber Newton method for a class of nonsmooth convex problems
- Nonsmoothness in machine learning: specific structure, proximal identification, and applications
- Two-step fixed-point proximity algorithms for multi-block separable convex problems
Cites work
- scientific article; zbMATH DE number 5430994 (no title available)
- scientific article; zbMATH DE number 1332320 (no title available)
- scientific article; zbMATH DE number 845714 (no title available)
- scientific article; zbMATH DE number 6438182 (no title available)
- A comparison of optimization methods and software for large-scale L1-regularized linear classification
- A feature selection Newton method for support vector machine classification
- A fixed-point proximity approach to solving the support vector regression with the group Lasso regularization
- A proximity algorithm accelerated by Gauss-Seidel iterations for L1/TV denoising models
- Distributed optimization and statistical learning via the alternating direction method of multipliers
- Functional reproducing kernel Hilbert spaces for non-point-evaluation functional data
- Generalized Mercer kernels and reproducing kernel Banach spaces
- Model Selection and Estimation in Regression with Grouped Variables
- Multi-step fixed-point proximity algorithms for solving a class of optimization problems arising from image processing
- Multi-task learning in vector-valued reproducing kernel Banach spaces with the \(\ell^1\) norm
- Optimization with sparsity-inducing penalties
- Proximity algorithms for image models: denoising
- Proximity algorithms for the L1/TV image denoising model
- Reproducing kernel Banach spaces for machine learning
- Reproducing kernel Banach spaces with the \(\ell^1\) norm
- Reproducing kernel Banach spaces with the \(\ell^{1}\) norm. II: Error analysis for regularized least square regression
- Sparse support vector machines in reproducing kernel Banach spaces
- Support-vector networks
- The Group Lasso for Logistic Regression
- Two-step fixed-point proximity algorithms for multi-block separable convex problems
- Universal kernels
Cited in (7)
- A Douglas-Rachford method for sparse extreme learning machine
- A fixed-point proximity approach to solving the support vector regression with the group Lasso regularization
- scientific article; zbMATH DE number 7626740 (no title available)
- A duality approach to regularized learning problems in Banach spaces
- Inexact fixed-point proximity algorithm for the \(\ell_0\) sparse regularization problem
- Convergence rate analysis for fixed-point iterations of generalized averaged nonexpansive operators
- Parameter choices for sparse regularization with the \(\ell^1\) norm