Convergence of a generalized subgradient method for nondifferentiable convex optimization (Q757242)

Cites work:
- Q4133397
- Q3805801
- On Poljak's improved subgradient method
- Two-direction subgradient method for non-differentiable optimization problems
- An aggregate subgradient method for nonsmooth convex minimization
- Minimization of unsmooth functionals
- Q5566712
- Q5655195
- Q5187067
- Q5181564
Full work available at URL: https://doi.org/10.1007/bf01594925
OpenAlex ID: W2038295970
 

Latest revision as of 08:54, 30 July 2024

scientific article

Language: English
Label: Convergence of a generalized subgradient method for nondifferentiable convex optimization
Description: scientific article

    Statements

    Title: Convergence of a generalized subgradient method for nondifferentiable convex optimization (English)
    Publication year: 1991
    The generalized subgradient method proposed in this paper for minimizing a convex function \(f:\mathbb{R}^n\to\mathbb{R}\) generates a sequence of points by an iteration of the form \[x_{k+1}=x_k-\sum_{i\in I_k}s_k^i g_i,\] where \(I_k\subset\{1,\dots,k\}\), \(g_i\in\partial f(x_i)\) and \(s_k^i\geq 0\). Observing that this iteration can be written as \(x_{k+1}=x_k-s_k d_k\), with appropriate \(s_k\geq 0\) and \(d_k\in\partial_{\epsilon_k}f(x_k)\) for some \(\epsilon_k\geq 0\), the authors derive convergence conditions from known convergence theorems for the \(\epsilon\)-subgradient method.
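    The iteration described above can be sketched in code. The following is a minimal illustration, not the authors' exact method: the index set \(I_k\), the choice \(f(x)=\|x\|_1\), and the diminishing step rule \(s_k=1/(2k)\) split evenly over the stored subgradients are all assumptions made here for concreteness.

```python
import numpy as np

# Illustrative nondifferentiable convex objective: f(x) = ||x||_1, minimized at x = 0.
def f(x):
    return np.abs(x).sum()

def subgradient(x):
    # np.sign(x) is a valid subgradient of the l1 norm (it is 0 at the kinks).
    return np.sign(x)

def generalized_subgradient(x0, iters=2000, window=3):
    """One way to realize x_{k+1} = x_k - sum_{i in I_k} s_k^i g_i:
    here I_k holds the last `window` iterates, and a diminishing,
    nonsummable step s_k = 1/(2k) is split evenly over the stored g_i.
    (These choices are for illustration only.)"""
    x = np.asarray(x0, dtype=float)
    grads = []
    for k in range(1, iters + 1):
        grads.append(subgradient(x))           # g_k in the subdifferential at x_k
        I_k = list(range(max(0, k - window), k))
        s = 0.5 / k                            # diminishing step size
        x = x - sum((s / len(I_k)) * grads[i] for i in I_k)
    return x

x_star = generalized_subgradient([3.0, -2.0])
print(f(x_star))  # close to the optimal value 0
```

Averaging the last few subgradients, as done here, is the sense in which the scheme generalizes the plain subgradient step \(x_{k+1}=x_k-s_k g_k\) (the case `window=1`); the combined direction plays the role of the \(\epsilon_k\)-subgradient \(d_k\) in the convergence analysis.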
    Keywords:
    - nondifferentiable optimization
    - generalized subgradient method
    - convergence conditions
    - \(\epsilon\)-subgradient
