Convex optimization on Banach spaces (Q285434)
Property / arXiv ID: 1401.0334
Language | Label | Description | Also known as
---|---|---|---
English | Convex optimization on Banach spaces | scientific article |
Statements
Convex optimization on Banach spaces (English)
19 May 2016
Optimization, and convex optimization in particular, is an important subject in applied mathematics because of its practical relevance, especially when very many variables may be adjusted in the search for the optimum of a function. Routine methods for computing such minima (normally only local minima) use first or even second derivatives, that is, gradients or Hessian matrices (the latter for a Newton method); in practice, however, gradients are often unavailable and Hessians almost never are. Algorithms that require no derivative evaluations at all (even when the functions to be minimized are, in theory, sufficiently smooth) therefore become increasingly interesting. In this paper, such methods are studied in the general setting of convex functions on Banach spaces, without evaluation of even gradients. The greedy algorithms are presented and analysed with respect to their convergence rates.
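The review above describes the derivative-free greedy approach only in broad terms. The following is a minimal illustrative sketch, not the algorithm analysed in the paper: it assumes a finite dictionary spanning the feasible set and a convex objective that can only be evaluated pointwise (exactly, whereas the paper also considers approximate evaluations). All names here (greedy_minimize, E, dictionary, n_steps) are assumptions made for this illustration.

```python
# Illustrative sketch: a Frank-Wolfe-style greedy scheme that minimizes a convex
# function over the convex hull of a finite dictionary, selecting the greedy
# element by function evaluations alone (no gradient calls).
import numpy as np

def greedy_minimize(E, dictionary, n_steps=50):
    """Greedily minimize the convex objective E over conv(dictionary).

    E          : convex objective, accessed only through evaluations.
    dictionary : list of numpy arrays g_1, ..., g_N spanning the feasible hull.
    n_steps    : number of greedy iterations.
    """
    x = dictionary[0].astype(float).copy()      # start at some dictionary element
    for m in range(1, n_steps + 1):
        lam = 2.0 / (m + 1)                     # classical step size 2/(m+1)
        # Greedy choice: among the convex combinations (1 - lam) * x + lam * g,
        # keep the one with the smallest objective value.
        candidates = [(1 - lam) * x + lam * g for g in dictionary]
        values = [E(c) for c in candidates]
        x = candidates[int(np.argmin(values))]
    return x

# Toy usage: minimize a convex quadratic over the simplex spanned by unit vectors.
if __name__ == "__main__":
    target = np.array([0.2, 0.5, 0.3])
    E = lambda x: float(np.sum((x - target) ** 2))
    dictionary = [np.eye(3)[i] for i in range(3)]
    x_star = greedy_minimize(E, dictionary, n_steps=200)
    print(x_star, E(x_star))
```

The point of the sketch is only that each iteration needs nothing beyond a handful of objective evaluations; the convergence-rate analysis for such greedy schemes in Banach spaces is the subject of the paper itself.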
sparse
optimization
greedy
Banach space
convergence rate
approximate evaluation