Taming numerical imprecision by adapting the KL divergence to negative probabilities
Publication: 6606973
DOI: 10.1007/s11222-024-10480-y
zbMATH Open: 1545.62115
MaRDI QID: Q6606973
Maren Klever, Rudolf Schill, Simon Pfahler, Peter Georg, Rainer Spang, L. Grasedyck, T. Wettig
Publication date: 17 September 2024
Published in: Statistics and Computing
Keywords: approximate Bayesian computation, Kullback-Leibler divergence, statistical optimization, tensor trains, mutual hazard networks
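As context for the keywords above: the standard Kullback-Leibler divergence is only defined for nonnegative inputs, while low-rank (e.g. tensor-train) approximations of a distribution can produce slightly negative entries, which is the numerical problem the paper's title refers to. The sketch below shows only the textbook KL divergence and where negative entries break it; the `kl_divergence` helper and its `eps` guard are illustrative assumptions, not the paper's adapted divergence.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Textbook Kullback-Leibler divergence: sum(p * log(p / q))."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    # The log of a negative ratio is NaN, so the standard KL
    # divergence is unusable when an approximation of q carries
    # small negative entries (e.g. from low-rank truncation).
    if np.any(p < 0) or np.any(q < 0):
        raise ValueError("KL divergence requires nonnegative entries")
    mask = p > 0  # terms with p == 0 contribute 0 by convention
    return float(np.sum(p[mask] * np.log(p[mask] / (q[mask] + eps))))

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])
print(kl_divergence(p, q))  # nonnegative by Gibbs' inequality
```

A tensor-train compression of `q` may violate the nonnegativity check above even when the exact `q` is a valid probability vector; handling that case robustly is the contribution the publication describes.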
Cites Work
- Title not available
- Title not available
- On Information and Sufficiency
- Tensor-Train Decomposition
- On Tensors, Sparsity, and Nonnegative Factorizations
- Tensor Spaces and Numerical Tensor Calculus
- A literature survey of low-rank tensor approximation techniques
- Alternating Minimal Energy Methods for Linear Systems in Higher Dimensions
- Rényi Divergence and Kullback-Leibler Divergence
- The infrared behaviour of QCD Green's functions. Confinement, dynamical symmetry breaking, and hadrons as relativistic bound states
- The alternating linear scheme for tensor optimization in the tensor train format
- Numerical Methods in Markov Chain Modeling
- Positive tensor factorization
- Information geometry and its applications
- Newton-based optimization for Kullback–Leibler nonnegative tensor factorizations
- Low-rank tensor methods for Markov chains with applications to tumor progression models