On the operator Hermite-Hadamard inequality (Q2054444)
From MaRDI portal
Latest revision as of 09:12, 27 July 2024
scientific article
Language | Label | Description | Also known as
---|---|---|---
English | On the operator Hermite-Hadamard inequality | scientific article |
Statements
On the operator Hermite-Hadamard inequality (English)
2 December 2021
Assume that \(f:J\subseteq\mathbb{R}\to\mathbb{R}\) is a convex function. Then the following inequality holds: \[ f\left(\frac{a+b}{2}\right)\le\frac{1}{b-a}\int_a^b f(t)\,dt \le\frac{f(a)+f(b)}{2}\qquad(a, b\in J,~a<b). \] This inequality is called the \textit{Hermite-Hadamard inequality}. Let \(\mathcal{H}\) be a Hilbert space and \(\mathbb{B}(\mathcal{H})\) be the \(C^*\)-algebra of all bounded linear operators on \(\mathcal{H}\).

In the paper under review, the authors use the Mond-Pečarić method to prove that \[ \int_0^1f\left(A \nabla_t B\right)dt \le \beta I_{\mathcal{H}}+\alpha\big(g(A)\nabla g(B)\big), \] where \(A, B\in\mathbb{B}(\mathcal{H})\) are two selfadjoint operators satisfying \(mI_{\mathcal{H}} \le A, B \le MI_{\mathcal{H}}\), \(f, g:[m, M]\to\mathbb{R}\) are continuous convex functions, \(\alpha\ge 0\), and \(\beta=\max_{m\le x \le M}\left\{a_f x+b_f-\alpha g(x)\right\}\), with \[ a_f=\frac{f(M)-f(m)}{M-m},\quad b_f=\frac{Mf(m)-mf(M)}{M-m}. \]

Further, the authors present a weighted generalization of the operator Hermite-Hadamard inequality as follows: \begin{align*} f\left(A \nabla_\lambda B\right) \le & \int_0^1f\Big(\left(A\nabla_\lambda B\right)\nabla_t A\Big)\nabla_\lambda f\Big(\left(A\nabla_\lambda B\right)\nabla_t B\Big)dt\\ \le & f(A) \nabla_\lambda f(B), \end{align*} where \(A, B\in \mathbb{B}(\mathcal{H})\) are two selfadjoint operators satisfying \(mI_{\mathcal{H}}\le A, B \le MI_{\mathcal{H}}\), \(f:[m, M]\to\mathbb{R}\) is an operator convex function, and \(0 \le \lambda \le 1\). The authors also give reverse Hermite-Hadamard inequalities by using the Mond-Pečarić method, and an operator Hermite-Hadamard inequality by using the gradient inequality.
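The inequalities quoted above can be illustrated numerically. The following sketch (not code from the paper under review; the matrices, function choices, and tolerances are this reviewer's assumptions) checks the scalar Hermite-Hadamard inequality for the convex function \(\exp\), and the outer bound \(f(A\nabla_\lambda B)\le f(A)\nabla_\lambda f(B)\) of the weighted operator inequality for the operator convex function \(f(x)=x^2\), where the Loewner order is verified via the smallest eigenvalue of the difference:

```python
import numpy as np

# 1) Scalar Hermite-Hadamard: for convex f on [a, b],
#    f((a+b)/2) <= (1/(b-a)) * integral of f over [a, b] <= (f(a)+f(b))/2.
f = np.exp                      # exp is convex on all of R
a, b = 0.0, 2.0
t = np.linspace(a, b, 200_001)
integral_mean = f(t).mean()     # fine-grid approximation of the integral mean
assert f((a + b) / 2) <= integral_mean <= (f(a) + f(b)) / 2

# 2) Outer bound of the weighted operator inequality in the Loewner order,
#    f(A nabla_lambda B) <= f(A) nabla_lambda f(B), for f(x) = x^2
#    (operator convex) and randomly generated selfadjoint A, B.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 4)); A = (X + X.T) / 2   # selfadjoint A
Y = rng.standard_normal((4, 4)); B = (Y + Y.T) / 2   # selfadjoint B
lam = 0.3
nabla = lam * A + (1 - lam) * B                      # A nabla_lambda B
# gap = f(A) nabla_lambda f(B) - f(A nabla_lambda B); here it equals
# lam*(1-lam)*(A-B)^2, hence is positive semidefinite.
gap = lam * A @ A + (1 - lam) * B @ B - nabla @ nabla
assert np.linalg.eigvalsh(gap).min() >= -1e-10       # PSD up to rounding
```

For \(f(x)=x^2\) the gap simplifies algebraically to \(\lambda(1-\lambda)(A-B)^2\), which explains why the positive-semidefiniteness check passes for every choice of selfadjoint \(A, B\).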
Hermite-Hadamard inequality
Mond-Pečarić method
selfadjoint operator
convex function