On the resolution of misspecified convex optimization and monotone variational inequality problems

From MaRDI portal
Publication:782913

DOI: 10.1007/S10589-020-00193-Z
zbMATH Open: 1447.90028
arXiv: 1408.5532
OpenAlex: W3035847440
MaRDI QID: Q782913
FDO: Q782913

Hesam Ahmadi, Uday V. Shanbhag

Publication date: 29 July 2020

Published in: Computational Optimization and Applications

Abstract: We consider a misspecified optimization problem that requires minimizing a function f(x;q*) over a closed and convex set X, where q* is an unknown vector of parameters that may be learnt through a parallel learning process. In this context, we examine the development of coupled schemes that generate iterates {x_k, q_k} such that, as k goes to infinity, {x_k} converges to x*, a minimizer of f(x;q*) over X, and {q_k} converges to q*. In the first part of the paper, we consider the solution of problems where f is either smooth or nonsmooth under various convexity assumptions on f. In addition, rate statements are provided to quantify the degradation in the convergence rate that results from the learning process. In the second part of the paper, we consider the solution of misspecified monotone variational inequality problems, which allows us to contend with more general equilibrium problems as well as with the possibility of misspecification in the constraints. We first present a constant-steplength misspecified extragradient scheme and prove its asymptotic convergence. Since this scheme relies on problem parameters (such as Lipschitz constants), we also present a misspecified variant of iterative Tikhonov regularization. Numerics support the asymptotic and rate statements.
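The coupled-scheme idea in the abstract — a computation step that minimizes f(x; q_k) using the current parameter estimate, run in parallel with a learning step driving q_k to q* — can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the quadratic objective, the box constraint set, and the averaging-based learning process are all assumptions chosen for the example.

```python
import numpy as np

# Illustrative misspecified problem: minimize f(x; q*) = ||A x - b(q*)||^2
# over a box X, where the scalar parameter q* is unknown but observed
# with noise by a parallel learning process. All data here are assumptions.
rng = np.random.default_rng(0)
q_star = 2.0                       # true (unknown) parameter
A = np.array([[2.0, 0.0],
              [0.0, 1.0]])
b = lambda q: np.array([q, 1.0])   # parameter-dependent data

def grad_x(x, q):
    # gradient of ||A x - b(q)||^2 with respect to x
    return 2.0 * A.T @ (A @ x - b(q))

def project_box(x, lo=-5.0, hi=5.0):
    # Euclidean projection onto the box X = [lo, hi]^2
    return np.clip(x, lo, hi)

x = np.zeros(2)                    # optimization iterate x_k
q = 0.0                            # running estimate q_k of q*
gamma = 0.1                        # constant steplength
for k in range(1, 2001):
    # learning step: average noisy observations, so q_k -> q*
    obs = q_star + 0.1 * rng.standard_normal()
    q += (obs - q) / k
    # computation step: projected gradient using the current,
    # still-misspecified estimate q_k
    x = project_box(x - gamma * grad_x(x, q))

# true minimizer (here the unconstrained solution lies inside X)
x_star = np.linalg.solve(A, b(q_star))
print(np.allclose(x, x_star, atol=1e-2))   # → True
```

The key point the example mirrors is that the computation step never waits for learning to finish: x_k tracks the minimizer of f(·; q_k), and since q_k converges to q*, the coupled iterates {x_k} converge to a minimizer of the true problem.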


Full work available at URL: https://arxiv.org/abs/1408.5532





Cites Work


Cited In (1)

Uses Software






This page was built for publication: On the resolution of misspecified convex optimization and monotone variational inequality problems
