scientific article; zbMATH DE number 3850830
Publication:3320132
Recommendations
- Rate of convergence of the method of feasible directions, not necessarily using the direction of steepest descent
- A descent method with the use of duality for the solution of a convex programming problem in a Hilbert space
- scientific article; zbMATH DE number 4057292
- scientific article; zbMATH DE number 3847229
- scientific article; zbMATH DE number 2102650
Cited in
- On efficiently solving the subproblems of a level-set method for fused lasso problems
- Inexact Newton regularization combined with two-point gradient methods for nonlinear ill-posed problems
- On the efficient computation of a generalized Jacobian of the projector over the Birkhoff polytope
- The approximate duality gap technique: a unified theory of first-order methods
- Is there an analog of Nesterov acceleration for gradient-based MCMC?
- A stochastic subspace approach to gradient-free optimization in high dimensions
- Accelerated Bregman proximal gradient methods for relatively smooth convex optimization
- An accelerated first-order method with complexity analysis for solving cubic regularization subproblems
- Fast and safe: accelerated gradient methods with optimality certificates and underestimate sequences
- A Hadamard-stable extension of Courant's sequential method for convex extremal problems
- Variational image regularization with Euler's elastica using a discrete gradient scheme
- Efficient multiplicative noise removal method using isotropic second order total variation
- Complexity of gradient descent for multiobjective optimization
- GMRES-accelerated ADMM for quadratic objectives
- A second-order cone based approach for solving the trust-region subproblem and its variants
- Proximal Gradient Methods for Machine Learning and Imaging
- An extension of the second order dynamical system that models Nesterov's convex gradient method
- Nonconvex robust programming via value-function optimization
- An accelerated IRNN-iteratively reweighted nuclear norm algorithm for nonconvex nonsmooth low-rank minimization problems
- A second-order adaptive Douglas-Rachford dynamic method for maximal \(\alpha\)-monotone operators
- Nearly optimal first-order methods for convex optimization under gradient norm measure: an adaptive regularization approach
- Preconditioned accelerated gradient descent methods for locally Lipschitz smooth objectives with applications to the solution of nonlinear PDEs
- PDE acceleration: a convergence rate analysis and applications to obstacle problems
- Convergence rates of an inertial gradient descent algorithm under growth and flatness conditions
- Inertial proximal gradient methods with Bregman regularization for a class of nonconvex optimization problems
- Momentum and stochastic momentum for stochastic gradient, Newton, proximal point and subspace descent methods
- An inertial extrapolation method for solving generalized split feasibility problems in real Hilbert spaces
- Gradient descent finds the cubic-regularized nonconvex Newton step
- A fast two-point gradient method for solving non-smooth nonlinear ill-posed problems
- Fast Proximal Methods via Time Scaling of Damped Inertial Dynamics
- Sharpness, restart, and acceleration
- Tikhonov regularization of a second order dynamical system with Hessian driven damping
- On the nonergodic convergence rate of an inexact augmented Lagrangian framework for composite convex programming
- Some accelerated alternating proximal gradient algorithms for a class of nonconvex nonsmooth problems
- Multicomposite nonconvex optimization for training deep neural networks
- Convergence rates for an inertial algorithm of gradient type associated to a smooth non-convex minimization
- Potential reduction method for harmonically convex programming
- Accelerated gradient-free optimization methods with a non-Euclidean proximal operator
- Sublinear-time quadratic minimization via spectral decomposition of matrices
- An accelerated method for nonlinear elliptic PDE
- Robust accelerated gradient methods for smooth strongly convex functions
- Optimal convergence rates for Nesterov acceleration
- An inertial Mann forward-backward splitting algorithm of variational inclusion problems and its applications
- Clustering of fuzzy data and simultaneous feature selection: a model selection approach
- Sparse adaptive parameterization of variability in image ensembles
- A wavelet frame approach for removal of mixed Gaussian and impulse noise on surfaces
- Learning partial differential equations via data discovery and sparse optimization
- Viscosity \(S\)-iteration method with inertial technique and self-adaptive step size for split variational inclusion, equilibrium and fixed point problems
- Analysis of optimization algorithms via integral quadratic constraints: nonstrongly convex problems
- Contrast invariant SNR and isotonic regressions
- Primal-dual accelerated gradient methods with small-dimensional relaxation oracle
- A nonmonotone gradient algorithm for total variation image denoising problems
- Convergence rates of a dual gradient method for constrained linear ill-posed problems
- Asymptotic equivalence of evolution equations governed by cocoercive operators and their forward discretizations
- A dimension reduction technique for large-scale structured sparse optimization problems with application to convex clustering
- Lower bounds for finding stationary points I
- scientific article; zbMATH DE number 7306860
- Directional total generalized variation regularization
- Nesterov’s accelerated gradient method for nonlinear ill-posed problems with a locally convex residual functional
- Alternating minimization methods for strongly convex optimization
- On the convergence of the iterates of proximal gradient algorithm with extrapolation for convex nonsmooth minimization problems
- Two optimization approaches for solving split variational inclusion problems with applications
- Provable accelerated gradient method for nonconvex low rank optimization
- Restarting the accelerated coordinate descent method with a rough strong convexity estimate
- Solving the OSCAR and SLOPE models using a semismooth Newton-based augmented Lagrangian method
- A finite element/operator-splitting method for the numerical solution of the two dimensional elliptic Monge-Ampère equation
- Accelerated gradient boosting
- Robust least square semidefinite programming with applications
- Performance of first-order methods for smooth convex minimization: a novel approach
- Approximation accuracy, gradient methods, and error bound for structured convex optimization
- Random algorithms for convex minimization problems
- Regularized estimation and testing for high-dimensional multi-block vector-autoregressive models
- iPiasco: inertial proximal algorithm for strongly convex optimization
- Domain adaptation and sample bias correction theory and algorithm for regression
- Solving structured nonsmooth convex optimization with complexity \(\mathcal {O}(\varepsilon ^{-1/2})\)
- Operator splittings, Bregman methods and frame shrinkage in image processing
- Efficient valuation of SCR via a neural network approach
- A convergent least-squares regularized blind deconvolution approach
- A differential variational approach for handling fluid-solid interaction problems via smoothed particle hydrodynamics
- Smooth strongly convex interpolation and exact worst-case performance of first-order methods
- Optimal subgradient methods: computational properties for large-scale linear inverse problems
- Clustering and feature selection using sparse principal component analysis
- Deep Learning--Based Dictionary Learning and Tomographic Image Reconstruction
- Image restoration with mixed or unknown noises
- On the second-order asymptotical regularization of linear ill-posed inverse problems
- Accelerated first-order primal-dual proximal methods for linearly constrained composite convex programming
- Exact worst-case performance of first-order methods for composite convex optimization
- A projected gradient and constraint linearization method for nonlinear model predictive control
- Certification aspects of the fast gradient method for solving the dual of parametric convex programs
- Projected subgradient minimization versus superiorization
- On the proximal gradient algorithm with alternated inertia
- Augmented Lagrangian algorithms for linear programming
- Sparse PCA: convex relaxations, algorithms and applications
- An efficient primal-dual method for the obstacle problem
- Accelerated sparse recovery via gradient descent with nonlinear conjugate gradient momentum
- Phase recovery, MaxCut and complex semidefinite programming
- A remark on accelerated block coordinate descent for computing the proximity operators of a sum of convex functions
- An Accelerated Level-Set Method for Inverse Scattering Problems
- Proximal splitting methods in signal processing
- Accelerated Bregman method for linearly constrained \(\ell _1-\ell _2\) minimization
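Many of the citing works above concern Nesterov-type acceleration ("Optimal convergence rates for Nesterov acceleration", "Is there an analog of Nesterov acceleration for gradient-based MCMC?"). As illustration, a minimal sketch of the classical accelerated gradient iteration with the standard momentum schedule; the quadratic test objective and step size below are illustrative choices, not taken from the indexed publication.

```python
import numpy as np

def accelerated_gradient(grad, x0, step, iters):
    """Accelerated gradient method with the classical (t_k) momentum schedule."""
    x = y = np.asarray(x0, dtype=float)
    t = 1.0
    for _ in range(iters):
        x_next = y - step * grad(y)  # gradient step at the extrapolated point
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        # momentum extrapolation toward the direction of recent progress
        y = x_next + ((t - 1.0) / t_next) * (x_next - x)
        x, t = x_next, t_next
    return x

# Illustrative problem: minimize f(x) = 0.5 x^T A x - b^T x, gradient A x - b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = np.linalg.solve(A, b)          # exact minimizer for comparison
x = accelerated_gradient(lambda z: A @ z - b, np.zeros(2),
                         step=0.25, iters=200)
```

The step size 0.25 is below 1/L for this matrix (largest eigenvalue about 3.62), which is the standard condition for convergence of the method.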