The KLS isoperimetric conjecture for generalized Orlicz balls (Q1621450)

From MaRDI portal
scientific article

    Statements

    The KLS isoperimetric conjecture for generalized Orlicz balls (English)
    8 November 2018
The paper under review establishes the Kannan-Lovász-Simonovits (KLS) isoperimetric conjecture for generalized Orlicz balls. Given a separable metric space \((X, d)\) with a Borel probability measure \(\mu\), the Cheeger constant is defined by \[ D_{\mathrm{Che}}(X, d, \mu)= \inf_{A\subset X} \frac{\mu^+(A)}{\min(\mu (A), 1-\mu (A))}, \] where the infimum runs over all Borel sets \(A\) and \(\mu^+(A)\) denotes the boundary measure of \(A\). Write \(D_{\mathrm{Che}}(\mu) = D_{\mathrm{Che}}(\mathbb{R}^n, |\cdot |, \mu)\) for the Euclidean metric \(|\cdot |\) on \(X=\mathbb{R}^n\), and let \(D_{\mathrm{Che}}^{\mathrm{Lin}}(\mu)\) denote the same infimum restricted to half-spaces \(A=H \subset \mathbb{R}^n\); clearly \(D_{\mathrm{Che}}(\mu) \le D_{\mathrm{Che}}^{\mathrm{Lin}}(\mu)\). If \(\mu=\lambda_K\) is the uniform (Lebesgue) probability measure on a compact convex set \(K\) with nonempty interior, Kannan, Lovász and Simonovits [\textit{R. Kannan} et al., Discrete Comput. Geom. 13, No. 3--4, 541--559 (1995; Zbl 0824.52012)] conjectured that \[ c D_{\mathrm{Che}}^{\mathrm{Lin}}(\lambda_K) \le D_{\mathrm{Che}}(\lambda_K) \] for some universal numerical constant \(c>0\), independent of any other parameter such as \(n\) or \(K\). The KLS conjecture thus asserts an essential equivalence between the nonlinear isoperimetric inequality and its linear relaxation; over the last two decades it has been shown to be of fundamental importance for the understanding of volumetric and spectral properties of convex domains, revealing numerous connections to other central conjectures on the concentration of volume in convex bodies. For the Poincaré inequality \(\|f - \int f d\mu\|_{L^2(\mu)} \le D_{\mathrm{Poin}}(\mu) \|\nabla f \|_{L^2(\mu)}\), one has \(D_{\mathrm{Poin}}(\lambda_K) = \frac{1}{\sqrt{\lambda_1(K)}}\), where \(\lambda_1(K)\) is the first non-zero eigenvalue of the Neumann Laplacian on \(K\); restricting to linear functions \(f\) clearly gives \(D_{\mathrm{Poin}}^{\mathrm{Lin}}(\mu) \le D_{\mathrm{Poin}}(\mu)\). \textit{V. G. Maz'ya} [Sov. Math., Dokl. 1, 882--885 (1960; Zbl 0114.31001); translation from Dokl. Akad. Nauk SSSR 133, 527--530 (1960)], \textit{J. Cheeger} [in: Probl. Analysis, Sympos. in Honor of Salomon Bochner, Princeton Univ. 1969, 195--199 (1970; Zbl 0212.44903)], \textit{P. Buser} [Ann. Sci. Éc. Norm. Supér. (4) 15, 213--230 (1982; Zbl 0501.53030)] and \textit{M. Ledoux} [Surv. Differ. Geom. 9, 219--240 (2004; Zbl 1061.58028)] showed that for all log-concave probability measures \(\mu\) on \(\mathbb{R}^n\), \[ \frac{1}{2} D_{\mathrm{Che}}(\mu) \le \frac{1}{D_{\mathrm{Poin}}(\mu)} \le C D_{\mathrm{Che}}(\mu) \] for some universal constant \(C > 1/2\); the same inequalities hold for the corresponding linear relaxations \(D_{\mathrm{Che}}^{\mathrm{Lin}}(\mu)\) and \(D_{\mathrm{Poin}}^{\mathrm{Lin}}(\mu)\). Consequently, the KLS conjecture may be equivalently reformulated as \[ D_{\mathrm{Poin}}(\mu) \le C D_{\mathrm{Poin}}^{\mathrm{Lin}}(\mu) \] for some universal constant \(C>1\) and all log-concave probability measures \(\mu\) on \(\mathbb{R}^n\). Section 1 explains the KLS conjecture and previously known results, and introduces generalized Orlicz balls in \(\mathbb{R}^n\): \(K_E = \{x\in \mathbb{R}^n: \sum_{i=1}^n V_i(x_i) \le E\}\) for \(n\) one-dimensional convex functions \(V_i: \mathbb{R}\to \mathbb{R}\). Theorem 1.1, a simplified form of the paper's main theorem, states: for \(\mu_i = \exp (- V_i (y))\,dy\) log-concave probability measures and random variables \(X_i\) distributed according to \(\mu_i\), set \(E_V = 1 + \sum_{i=1}^n \mathbb{E} V_i(X_i)\); then for \(E=E_V\le n+1\) one has \(C^{-n}\le \mathrm{Vol}(K_E) \le C^n\) and \[ D_{\mathrm{Poin}}(\lambda_{K_E}) \le C \log (e + A^{(2)} \wedge n)\, D_{\mathrm{Poin}}^{\mathrm{Lin}}(\lambda_{K_E}). \] This confirms the KLS conjecture for the generalized Orlicz ball \(K_E\). Theorem 1.2 further shows that the KLS conjecture holds for \(K_E= \{x\in \mathbb{R}^n: V(x) \le E\}\) with \(A^{(2)} \wedge n =\sqrt{n}\, D_{\mathrm{Poin}}(\mu)\). 
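As a simple sanity check on these definitions (the computation is this reviewer's illustration, not taken from the paper), both constants can be evaluated exactly for the uniform measure on the interval \([0,1]\):

```latex
% Uniform measure \lambda_{[0,1]} on K=[0,1]. Half-spaces are half-lines
% A=[0,t], with boundary measure \mu^+(A)=1 (the density at the endpoint), so
\[
  D_{\mathrm{Che}}^{\mathrm{Lin}}(\lambda_{[0,1]})
  = \inf_{0<t<1} \frac{1}{\min(t,\,1-t)} = 2 ,
\]
% attained at t=1/2. Since in dimension one half-lines are isoperimetric
% minimizers, also
\[
  D_{\mathrm{Che}}(\lambda_{[0,1]}) = 2
  = D_{\mathrm{Che}}^{\mathrm{Lin}}(\lambda_{[0,1]}),
\]
% so the KLS inequality holds on [0,1] with c=1. On the spectral side,
% the first nonzero Neumann eigenvalue of [0,1] is \lambda_1 = \pi^2, hence
\[
  D_{\mathrm{Poin}}(\lambda_{[0,1]}) = \frac{1}{\pi},
  \qquad
  \frac{1}{2}\, D_{\mathrm{Che}} = 1 \;\le\; \frac{1}{D_{\mathrm{Poin}}} = \pi
  \;\le\; C\, D_{\mathrm{Che}} = 2C ,
\]
% consistent with the Maz'ya--Cheeger--Buser--Ledoux two-sided bound
% quoted above (any universal C \ge \pi/2 works in this example).
```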
Various examples are explicitly presented. Section 2 states the results, from Theorem 2.1 (the main technical theorem) to Theorem 2.4 (the main theorem), with technical support from Corollary 2.2 and Proposition 2.3. Section 3 introduces concentration profiles and Barthe and Milman's result (Proposition 3.1) relating the concentration profiles \({K}_1\) and \({K}_2\). Concerning concentration inequalities for sums of independent random variables bounded from one side only (as opposed to the Hoeffding inequality for bounded random variables), Theorem 3.6 establishes a one-sided Hoeffding inequality in the spirit of Bernstein's inequality. Section 4 bounds the volume \(\mathrm{Vol}(K_E)\), first in Proposition 4.1. Part of Proposition 2.3 is proved in Proposition 4.6, a refinement of Fradelizi's bound which is sharp whenever \(\mu\) is log-affine on an appropriate convex cone, and in Proposition 4.9, which treats the barycenter and the covariance matrix of \(K_E\). Section 5 first bounds the 1-Wasserstein distance between the probability measure induced on the boundary of a star-shaped body and a probability measure obtained from a Borel function with an indicator bound; then an \(L^1\)-version of a Hardy-type inequality is proved in Lemma 5.4, reducing various spectral-gap questions from the star-shaped body to its boundary \(\partial \Omega\). Section 6 completes the proof of the main technical theorem and puts everything together. Subsection 6.1 first presents the proof of Theorem 2.1, reducing to the condition \(D_{\mathrm{Poin}}(\nu)^2/D_{\mathrm{Poin}}^{\mathrm{Lin}}(\nu)^2 \in [1, 12]\) under which the KLS conjecture is valid. By a well-known result of \textit{M. Gromov} and \textit{V. D. Milman} [Am. J. Math. 105, 843--854 (1983; Zbl 0522.53039)], a Poincaré inequality always implies the following exponential concentration: \(K_{\nu}(r) \le e^{- c_0 \frac{r}{D_{\mathrm{Poin}}(\nu)}}\). 
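The Gromov-Milman implication can be checked on a classical one-dimensional example (this illustration is the reviewer's, not the paper's; the Poincaré constant of the exponential measure is the classical Muckenhoupt bound):

```latex
% For the one-sided exponential measure d\nu = e^{-x}\,\mathbf{1}_{[0,\infty)}\,dx,
% the sharp Poincaré inequality reads \mathrm{Var}_\nu(f) \le 4\,\|f'\|_{L^2(\nu)}^2,
% i.e. D_{\mathrm{Poin}}(\nu) = 2, while the tail of \nu beyond the median
% m = \log 2 decays exactly exponentially:
\[
  K_{\nu}(r) \;\le\; \nu\bigl([m+r,\infty)\bigr) \;=\; \tfrac{1}{2}\, e^{-r},
\]
% so the Gromov--Milman bound
% K_\nu(r) \le e^{-c_0 r / D_{\mathrm{Poin}}(\nu)} = e^{-c_0 r/2}
% indeed holds here for any constant c_0 \le 2.
```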
By applying Propositions 3.5 and 4.1, the concentration profile \(K_{\mu_{K_E}, w_0/\sqrt{n}}(r)\) admits an exponential bound, together with the estimates from the \(L^1\)-version of the Hardy-type inequality. It then remains to invoke a result, established in a more general weighted Riemannian setting, asserting the equivalence between concentration, spectral gap and linear isoperimetry under appropriate convexity assumptions; Theorem 2.1 follows. Subsection 6.2 presents a proof of Theorem 2.5, identical to that of Theorem 2.1 described in Subsection 6.1 except for the first step: instead of invoking the \(L^p\) estimate of Proposition 3.5 to transfer concentration from \(\mu\) to \(\mu_{K_E, w}\), one invokes the \(L^{\infty}\) estimate of Lemma 3.4. Subsection 6.3 concludes the proofs of assertion (8) of Proposition 2.3 (by the well-known bath-tub principle), of Theorem 2.4 (by Proposition 2.3), of Theorem 1.1 (the dimension-independent part follows immediately from an application of Theorem 2.4 for any \(E\in [E_{\min}, E_{\max}]\)), and of Theorem 1.2 (by applying Theorem 2.5 to \(\mu = \mu_1 \otimes \mu_2 \otimes \cdots \otimes \mu_n\)). Subsection 6.4 gives a general formulation after rescaling and states the main theorem in generalized form in Corollary 6.3, while Subsection 6.5 establishes Examples 1.4 and 1.5. For convex bounded domains other than generalized Orlicz balls, and for measures that are not log-concave, the KLS conjecture remains open.
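To illustrate the normalization \(E_V\) of Theorem 1.1 on a familiar family (the following computation is the reviewer's, under the stated assumptions of Theorem 1.1), take all potentials to be \(p\)-th powers, so that \(K_E\) is an \(\ell_p^n\) ball:

```latex
% With V_i(t) = |t|^p (p \ge 1) for all i, K_E = \{x : \sum_i |x_i|^p \le E\}.
% For d\mu_i \propto e^{-|t|^p}\,dt, using \int_0^\infty t^{a-1}e^{-t^p}dt
% = \Gamma(a/p)/p, one computes
\[
  \mathbb{E}\, V_i(X_i)
  = \frac{\int_0^\infty t^{p}\, e^{-t^p}\,dt}{\int_0^\infty e^{-t^p}\,dt}
  = \frac{\Gamma(1+1/p)/p}{\Gamma(1/p)/p}
  = \frac{1}{p},
\]
% so E_V = 1 + n/p \le n+1, and K_{E_V} is the \ell_p^n ball of radius
% (1 + n/p)^{1/p}. By Stirling's formula this radius makes
% \mathrm{Vol}(K_{E_V})^{1/n} of constant order, consistent with the bound
% C^{-n} \le \mathrm{Vol}(K_{E_V}) \le C^{n} quoted above.
```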
    KLS conjecture
    spectral-gap
    convex bodies
    generalized Orlicz balls
    log-concave probability measure
    Poincaré constant
    Cheeger constant
    concentration profile
