Pages that link to "Item:Q5254990"
The following pages link to Incremental Majorization-Minimization Optimization with Application to Large-Scale Machine Learning (Q5254990):
Displaying 49 items.
- Stream-suitable optimization algorithms for some soft-margin support vector machine variants (Q145488) (← links)
- The log-exponential smoothing technique and Nesterov's accelerated gradient method for generalized Sylvester problems (Q274096) (← links)
- On the linear convergence of the approximate proximal splitting method for non-smooth convex optimization (Q489108) (← links)
- Nonconvex nonsmooth optimization via convex-nonconvex majorization-minimization (Q530079) (← links)
- Stochastic variance reduced gradient methods using a trust-region-like scheme (Q1995995) (← links)
- Incremental quasi-subgradient methods for minimizing the sum of quasi-convex functions (Q2010105) (← links)
- Generalized stochastic Frank-Wolfe algorithm with stochastic "substitute" gradient for structured convex optimization (Q2020608) (← links)
- Linear convergence of inexact descent method and inexact proximal gradient algorithms for lower-order regularization problems (Q2022292) (← links)
- Stochastic quasi-gradient methods: variance reduction via Jacobian sketching (Q2039235) (← links)
- An outer-inner linearization method for non-convex and nondifferentiable composite regularization problems (Q2046332) (← links)
- Stochastic DCA for minimizing a large sum of DC functions with application to multi-class logistic regression (Q2057761) (← links)
- A hybrid stochastic optimization framework for composite nonconvex optimization (Q2118109) (← links)
- Block-coordinate and incremental aggregated proximal gradient methods for nonsmooth nonconvex problems (Q2133414) (← links)
- A generalized proximal linearized algorithm for DC functions with application to the optimal size of the firm problem (Q2158622) (← links)
- Accelerating incremental gradient optimization with curvature information (Q2181597) (← links)
- Linear convergence of cyclic SAGA (Q2193004) (← links)
- Improved SVRG for finite sum structure optimization with application to binary classification (Q2244210) (← links)
- Optimizing cluster structures with inner product induced norm based dissimilarity measures: theoretical development and convergence analysis (Q2282276) (← links)
- Majorization-minimization generalized Krylov subspace methods for \(\ell_p\)-\(\ell_q\) optimization applied to image restoration (Q2359752) (← links)
- Proximal average approximated incremental gradient descent for composite penalty regularized empirical risk minimization (Q2398094) (← links)
- Convergence rates of accelerated proximal gradient algorithms under independent noise (Q2420162) (← links)
- Generalized forward-backward splitting with penalization for monotone inclusion problems (Q2423787) (← links)
- Coordinate descent with arbitrary sampling I: algorithms and complexity (Q2829565) (← links)
- Adaptive Sampling for Incremental Optimization Using Stochastic Gradient Descent (Q2835640) (← links)
- (Q4558169) (← links)
- Catalyst Acceleration for First-order Convex Optimization: from Theory to Practice (Q4558545) (← links)
- (Q4558559) (← links)
- Composite Difference-Max Programs for Modern Statistical Estimation Problems (Q4562249) (← links)
- (Q4637046) (← links)
- Global Convergence Rate of Proximal Incremental Aggregated Gradient Methods (Q4641660) (← links)
- Surpassing Gradient Descent Provably: A Cyclic Incremental Method with Linear Convergence Rate (Q4641666) (← links)
- A Coordinate-Descent Primal-Dual Algorithm with Large Step Size and Possibly Nonseparable Functions (Q4646445) (← links)
- Proximal-Like Incremental Aggregated Gradient Method with Linear Convergence Under Bregman Distance Growth Conditions (Q4991666) (← links)
- (Q5054622) (← links)
- Modulus-based iterative methods for constrained \(\ell_p\)-\(\ell_q\) minimization (Q5117383) (← links)
- (Q5148937) (← links)
- Stochastic proximal quasi-Newton methods for non-convex composite optimization (Q5198046) (← links)
- Stochastic sub-sampled Newton method with variance reduction (Q5204645) (← links)
- An Inexact Variable Metric Proximal Point Algorithm for Generic Quasi-Newton Acceleration (Q5231671) (← links)
- Riemannian Stochastic Variance Reduced Gradient Algorithm with Retraction and Vector Transport (Q5231672) (← links)
- IQN: An Incremental Quasi-Newton Method with Local Superlinear Convergence Rate (Q5745078) (← links)
- A Bregman Forward-Backward Linesearch Algorithm for Nonconvex Composite Optimization: Superlinear Convergence to Nonisolated Local Minima (Q5853567) (← links)
- Incremental Quasi-Subgradient Method for Minimizing Sum of Geodesic Quasi-Convex Functions on Riemannian Manifolds with Applications (Q5861980) (← links)
- Bregman Finito/MISO for Nonconvex Regularized Finite Sum Minimization without Lipschitz Gradient Continuity (Q5869813) (← links)
- Stochastic Difference-of-Convex-Functions Algorithms for Nonconvex Programming (Q5869814) (← links)
- An aggressive reduction on the complexity of optimization for non-strongly convex objectives (Q6052286) (← links)
- Efficiency of higher-order algorithms for minimizing composite functions (Q6155068) (← links)
- Random-reshuffled SARAH does not need full gradient computations (Q6204201) (← links)
- SPIRAL: a superlinearly convergent incremental proximal algorithm for nonconvex finite sum minimization (Q6498409) (← links)