Subsampled first-order optimization methods with applications in imaging
Publication: 6606441
Keywords: first-order methods; stochastic gradient; neural networks; convolutional neural networks; image classification; finite-sum minimization
MSC classifications: Learning and adaptive systems in artificial intelligence (68T05); Applications of mathematical programming (90C90); Nonconvex programming, global optimization (90C26); Image processing (compression, reconstruction, etc.) in information and communication theory (94A08); Computing methodologies for image processing (68U10)
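The keywords above refer to subsampled (mini-batch) first-order methods for finite-sum minimization, i.e. minimizing f(x) = (1/n) * sum_i f_i(x) using gradients estimated from a random subsample of the terms. The following is a minimal illustrative sketch of that general idea in Python/NumPy; the function names, step size, batch size, and least-squares example are assumptions for illustration and are not taken from the paper itself.

```python
import numpy as np

def subsampled_gradient_descent(grad_batch, n, x0, step=0.05, batch_size=32,
                                max_iters=1000, rng=None):
    """Minimize f(x) = (1/n) * sum_i f_i(x) with subsampled gradients.

    grad_batch(x, idx) must return the average gradient of f_i over the
    index array idx (illustrative interface, not from the paper).
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iters):
        # Draw a random subsample of the n terms and take a gradient step.
        idx = rng.choice(n, size=min(batch_size, n), replace=False)
        x = x - step * grad_batch(x, idx)
    return x

# Example (hypothetical data): least-squares terms f_i(x) = 0.5*(a_i @ x - b_i)^2
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.normal(size=(500, 10))
    x_true = rng.normal(size=10)
    b = A @ x_true + 0.01 * rng.normal(size=500)

    def grad_batch(x, idx):
        r = A[idx] @ x - b[idx]
        return A[idx].T @ r / len(idx)

    x_hat = subsampled_gradient_descent(grad_batch, n=500, x0=np.zeros(10),
                                        step=0.05, batch_size=32,
                                        max_iters=2000, rng=rng)
    print("estimation error:", np.linalg.norm(x_hat - x_true))
```

In practice, methods of this kind pair the subsampled gradient with diminishing or adaptive step sizes and with sample-size selection rules that control the variance of the gradient estimate, as reflected in several of the cited works below.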
Recommendations
- An introduction to continuous optimization for imaging
- First-order methods for convex optimization
- Subsampled nonmonotone spectral gradient methods
- A first-order primal-dual algorithm for convex problems with applications to imaging
- Subsampled inexact Newton methods for minimizing large sums of convex functions
Cites work
- Scientific article, zbMATH DE number 5070674 (title not available)
- A Scaled Stochastic Approximation Algorithm
- A Stochastic Approximation Method
- A gradient method for unconstrained optimization in noisy environment
- A stochastic line search method with expected complexity analysis
- A stochastic quasi-Newton method for large-scale optimization
- Accelerated Stochastic Approximation
- Accelerated Stochastic Approximation
- Adaptive Regularization Algorithms with Inexact Evaluations for Nonconvex Optimization
- Adaptive cubic regularization methods with dynamic inexact Hessian information and applications to finite-sum minimization
- An introduction to matrix concentration inequalities
- An investigation of Newton-sketch and subsampled Newton methods
- Deep learning
- Descent direction method with line search for unconstrained optimization in noisy environment
- Efficient sample sizes in stochastic nonlinear programming
- Exact and inexact subsampled Newton methods for optimization
- Global convergence rate analysis of unconstrained optimization methods based on probabilistic models
- Gradient Convergence in Gradient Methods with Errors
- Hybrid deterministic-stochastic methods for data fitting
- Inexact restoration approach for minimization with inexact evaluation of the objective function
- Inexact restoration with subsampled trust-region methods for finite-sum minimization
- Introduction to Stochastic Search and Optimization
- Learning representations by back-propagating errors
- Line search methods with variable sample size for unconstrained optimization
- Linear algebra and learning from data
- Minimization of functions having Lipschitz continuous first partial derivatives
- Minimizing finite sums with the stochastic average gradient
- Neural networks and deep learning. A textbook
- New stochastic approximation algorithms with adaptive step sizes
- Newton Sketch: A Near Linear-Time Optimization Algorithm with Linear-Quadratic Convergence
- Newton-type methods for non-convex optimization under inexact Hessian information
- Nonlinear stepsize control, trust regions and regularizations for unconstrained optimization
- Nonmonotone line search methods with variable sample size
- Numerical Optimization
- On stochastic gradient and subgradient methods with adaptive steplength sequences
- On the behavior of the gradient norm in the steepest descent method
- On the employment of inexact restoration for the minimization of functions whose evaluation is subject to errors
- Optimization methods for large-scale machine learning
- Pattern recognition and machine learning
- Robust Stochastic Approximation Approach to Stochastic Programming
- Sample size selection in optimization methods for machine learning
- Stochastic Estimation of the Maximum of a Regression Function
- Stochastic optimization using a trust-region method and random models
- Sub-sampled Newton methods
- Subsampled inexact Newton methods for minimizing large sums of convex functions
- The Barzilai and Borwein Gradient Method for the Large Scale Unconstrained Minimization Problem
- The elements of statistical learning. Data mining, inference, and prediction
- Two-Point Step Size Gradient Methods