scientific article; zbMATH DE number 3850830
From MaRDI portal
Publication:3320132
Recommendations
- Rate of convergence of the method of feasible directions, not necessarily using the direction of steepest descent
- A descent method with the use of duality for the solution of a convex programming problem in a Hilbert space
- scientific article; zbMATH DE number 4057292
- scientific article; zbMATH DE number 3847229
- scientific article; zbMATH DE number 2102650
Cited in
- On efficiently solving the subproblems of a level-set method for fused lasso problems
- Inexact Newton regularization combined with two-point gradient methods for nonlinear ill-posed problems
- On the efficient computation of a generalized Jacobian of the projector over the Birkhoff polytope
- The approximate duality gap technique: a unified theory of first-order methods
- Is there an analog of Nesterov acceleration for gradient-based MCMC?
- A stochastic subspace approach to gradient-free optimization in high dimensions
- Accelerated Bregman proximal gradient methods for relatively smooth convex optimization
- An accelerated first-order method with complexity analysis for solving cubic regularization subproblems
- Fast and safe: accelerated gradient methods with optimality certificates and underestimate sequences
- A Hadamard-stable extension of Courant's sequential method for convex extremal problems
- Variational image regularization with Euler's elastica using a discrete gradient scheme
- Efficient multiplicative noise removal method using isotropic second order total variation
- Complexity of gradient descent for multiobjective optimization
- GMRES-accelerated ADMM for quadratic objectives
- A second-order cone based approach for solving the trust-region subproblem and its variants
- Proximal Gradient Methods for Machine Learning and Imaging
- An extension of the second order dynamical system that models Nesterov's convex gradient method
- Nonconvex robust programming via value-function optimization
- An accelerated IRNN-iteratively reweighted nuclear norm algorithm for nonconvex nonsmooth low-rank minimization problems
- A second-order adaptive Douglas-Rachford dynamic method for maximal \(\alpha\)-monotone operators
- Nearly optimal first-order methods for convex optimization under gradient norm measure: an adaptive regularization approach
- Preconditioned accelerated gradient descent methods for locally Lipschitz smooth objectives with applications to the solution of nonlinear PDEs
- PDE acceleration: a convergence rate analysis and applications to obstacle problems
- Convergence rates of an inertial gradient descent algorithm under growth and flatness conditions
- Inertial proximal gradient methods with Bregman regularization for a class of nonconvex optimization problems
- Momentum and stochastic momentum for stochastic gradient, Newton, proximal point and subspace descent methods
- An inertial extrapolation method for solving generalized split feasibility problems in real Hilbert spaces
- Gradient descent finds the cubic-regularized nonconvex Newton step
- A fast two-point gradient method for solving non-smooth nonlinear ill-posed problems
- Fast Proximal Methods via Time Scaling of Damped Inertial Dynamics
- Sharpness, restart, and acceleration
- Tikhonov regularization of a second order dynamical system with Hessian driven damping
- On the nonergodic convergence rate of an inexact augmented Lagrangian framework for composite convex programming
- Some accelerated alternating proximal gradient algorithms for a class of nonconvex nonsmooth problems
- Multicomposite nonconvex optimization for training deep neural networks
- Convergence rates for an inertial algorithm of gradient type associated to a smooth non-convex minimization
- Potential reduction method for harmonically convex programming
- Accelerated gradient-free optimization methods with a non-Euclidean proximal operator
- Sublinear-time quadratic minimization via spectral decomposition of matrices
- An accelerated method for nonlinear elliptic PDE
- Robust accelerated gradient methods for smooth strongly convex functions
- Optimal convergence rates for Nesterov acceleration
- An inertial Mann forward-backward splitting algorithm of variational inclusion problems and its applications
- Clustering of fuzzy data and simultaneous feature selection: a model selection approach
- Sparse adaptive parameterization of variability in image ensembles
- A wavelet frame approach for removal of mixed Gaussian and impulse noise on surfaces
- Learning partial differential equations via data discovery and sparse optimization
- Viscosity \(S\)-iteration method with inertial technique and self-adaptive step size for split variational inclusion, equilibrium and fixed point problems
- Analysis of optimization algorithms via integral quadratic constraints: nonstrongly convex problems
- Contrast invariant SNR and isotonic regressions
- Primal-dual accelerated gradient methods with small-dimensional relaxation oracle
- A nonmonotone gradient algorithm for total variation image denoising problems
- Convergence rates of a dual gradient method for constrained linear ill-posed problems
- Asymptotic equivalence of evolution equations governed by cocoercive operators and their forward discretizations
- A dimension reduction technique for large-scale structured sparse optimization problems with application to convex clustering
- Lower bounds for finding stationary points I
- scientific article; zbMATH DE number 7306860
- Directional total generalized variation regularization
- Nesterov’s accelerated gradient method for nonlinear ill-posed problems with a locally convex residual functional
- Alternating minimization methods for strongly convex optimization
- On the convergence of the iterates of proximal gradient algorithm with extrapolation for convex nonsmooth minimization problems
- Two optimization approaches for solving split variational inclusion problems with applications
- Provable accelerated gradient method for nonconvex low rank optimization
- Restarting the accelerated coordinate descent method with a rough strong convexity estimate
- Solving the OSCAR and SLOPE models using a semismooth Newton-based augmented Lagrangian method
- A finite element/operator-splitting method for the numerical solution of the two dimensional elliptic Monge-Ampère equation
- Strong Convergence of Trajectories via Inertial Dynamics Combining Hessian-Driven Damping and Tikhonov Regularization for General Convex Minimizations
- Improved convergence rates and trajectory convergence for primal-dual dynamical systems with vanishing damping
- Fast and stable nonconvex constrained distributed optimization: the ELLADA algorithm
- Dualization and Automatic Distributed Parameter Selection of Total Generalized Variation via Bilevel Optimization
- An accelerated common fixed point algorithm for a countable family of \(G\)-nonexpansive mappings with applications to image recovery
- Generalized mirror prox algorithm for monotone variational inequalities: Universality and inexact oracle
- Modified hybrid projection methods with SP iterations for quasi-nonexpansive multivalued mappings in Hilbert spaces
- An inertial semi-forward-reflected-backward splitting and its application
- Accelerated sampling Kaczmarz Motzkin algorithm for the linear feasibility problem
- Fast convergence of dynamical ADMM via time scaling of damped inertial dynamics
- An accelerated Uzawa method for application to frictionless contact problem
- Scaled, inexact, and adaptive generalized FISTA for strongly convex optimization
- New Bregman proximal type algorithms for solving DC optimization problems
- Accelerated additive Schwarz methods for convex optimization with adaptive restart
- Accelerated randomized mirror descent algorithms for composite non-strongly convex optimization
- An inexact variable metric proximal point algorithm for generic quasi-Newton acceleration
- Inertial accelerated primal-dual methods for linear equality constrained convex optimization problems
- On obtaining sparse semantic solutions for inverse problems, control, and neural network training
- First-order inertial algorithms involving dry friction damping
- An accelerated forward-backward algorithm with applications to image restoration problems
- Accelerated optimization on Riemannian manifolds via discrete constrained variational integrators
- scientific article; zbMATH DE number 7370534
- Accelerated proximal gradient method for bi-modulus static elasticity
- A parallel Tseng's splitting method for solving common variational inclusion applied to signal recovery problems
- Inertial, corrected, primal-dual proximal splitting
- A fast continuous time approach with time scaling for nonsmooth convex optimization
- A nested primal-dual FISTA-like scheme for composite convex optimization problems
- Fast inertial extragradient algorithms for solving non-Lipschitzian equilibrium problems without monotonicity condition in real Hilbert spaces
- Generating Nesterov's accelerated gradient algorithm by using optimal control theory for optimization
- Faster Lagrangian-based methods in convex optimization
- Convergence rates of inertial primal-dual dynamical methods for separable convex optimization problems
- Fast inertial dynamic algorithm with smoothing method for nonsmooth convex optimization
- A dual approach for optimal algorithms in distributed optimization over networks
- A new method with regularization for solving split variational inequality problems in real Hilbert spaces