Subgradient projectors: extensions, theory, and characterizations (Q1711098)

From MaRDI portal
scientific article
    Statements

    Subgradient projectors: extensions, theory, and characterizations (English)
    16 January 2019
    This article, of monograph length, by Heinz H. Bauschke, Caifang Wang, Xianfu Wang and Jia Xu is a deep contribution to nonlinear analysis and geometry, situated at the crossroads of future advances in the theories of inverse problems, optimization and optimal control, data mining, and analytics. Based on the strength and significance of the results achieved by the authors, future investigations can be designed and real-world applications made in nearly every branch of science, engineering, economics, management, operational research, medicine, neuroscience, and the earth and environmental sciences. \par In fact, subgradient projectors are of central importance both in optimization and in the treatment of convex feasibility problems. For any locally Lipschitz function, the authors introduce a subgradient projector with the help of generalized (limiting) subgradients, not only when the function is convex but also when it is nonconvex (the classical convex-case formula is recalled below). The article is composed of three parts. In Part I, the authors investigate fundamental properties of subgradient projectors and characterize when a subgradient projector is a ``cutter'', a ``local cutter'' or a quasi-nonexpansive mapping. They offer both a global and a local convergence analysis for subgradient projectors, and numerous examples are given to illustrate the theory. In Part II, the authors analyze the relationship between the subgradient projector of a prox-regular function and the subgradient projector of its Moreau envelope. Furthermore, they characterize when a mapping is the subgradient projector of some convex function. In Part III, they focus on properties of subgradient projectors. Under suitable conditions, a linear operator is the subgradient projector of a convex function if and only if it is a convex combination of the identity operator and the projection operator onto some subspace. In general, however, convex combinations and compositions of subgradient projectors of convex functions need not be subgradient projectors of convex functions. \par This deep paper is timely, well integrated into the academic landscape, full of promise for future work, well documented and illustrated, and carefully written. \par The eleven sections of this valuable work are as follows: 1. Introduction, 2. An Extension of Subgradient Projector via Limiting Subgradients, 3. Calculus for Subgradient Projectors, 4. Basic Properties of Subgradient Projectors, 5. When is the Subgradient Projector $G_f$ a Cutter or Local Cutter?, 6. Convergence Analysis of Subgradient Projectors, 7. Subgradient Projectors of Moreau Envelopes, 8. Characterization of Subgradient Projectors of Convex Functions, 9. Characterizations of $G_{f,s}$ When $G_{f,s}$ is Linear, 10. Subgradient Projectors of Convex Functions are Not Closed Under Convex Combinations and Compositions, and 11. A Complete Analysis of Linear Subgradient Projectors on $\mathbb{R}^2$. \par In the future, refinements and improvements of these results and approaches, as well as new methods and implementations, can be expected from the research community inspired by this monograph-length study.
Such future contributions might address (i) further and special forms of nonlinearity, nondifferentiability, impulsiveness, uncertainty and stochasticity in nonlinear geometry, analysis, optimization and optimal control, (ii) improvements and new extensions of singularity theory and Morse theory, (iii) various ways of involving time (discrete and continuous time, change-of-time methods, etc.), and (iv) new forms of multi-objective optimization and the newest methods (such as optimization and robust optimization of desirability functions, Tikhonov regularization and ridge regression, MARS, CMARS, R(C)MARS). \par Such emerging work could foster vast advances in the natural sciences, engineering and medicine, in neuroscience, healthcare and the social sciences, in economics and game theory, in image, speech and video processing, and in the bio-, earth- and environmental sciences.
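    For orientation, the following display is not quoted from the paper under review; it recalls, under standard assumptions, the classical convex-case construction that the authors extend. For a convex function $f$ with nonempty level set $\{x : f(x)\le 0\}$ and a selection $s(x)\in\partial f(x)$ (note that $s(x)\neq 0$ whenever $f(x)>0$), the subgradient projector is
\[
G_{f,s}(x) \;=\;
\begin{cases}
x - \dfrac{f(x)}{\|s(x)\|^{2}}\, s(x), & \text{if } f(x) > 0,\\[1ex]
x, & \text{if } f(x) \le 0;
\end{cases}
\]
the paper replaces $\partial f$ by limiting subgradients, so that $G_{f,s}$ is defined for any locally Lipschitz $f$. In this notation, the Part III characterization mentioned above reads: under suitable conditions, a linear operator $G$ is the subgradient projector of a convex function if and only if $G = \lambda\,\operatorname{Id} + (1-\lambda)\,P_V$ for some subspace $V$, its projection operator $P_V$, and some $\lambda\in[0,1]$.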
    approximately convex function
    averaged mapping
    cutter
    essentially strictly differentiable function
    fixed point
    limiting subgradient
    local cutter
    local quasi-firmly nonexpansive mapping
    local quasi-nonexpansive mapping
    locally Lipschitz function
    linear cutter
    linear firmly nonexpansive mapping
    linear subgradient projection operator
    Moreau envelope
    projection
    prox-bounded
    proximal mapping
    prox-regular function
    quasi-firmly nonexpansive mapping
    quasi-nonexpansive mapping
    \((C, \varepsilon)\)-firmly nonexpansive mapping
    subdifferentiable function
    subgradient projection operator