Probabilistic Lipschitz analysis of neural networks
From MaRDI portal
Cites work
- A geometric inequality and the complexity of computing volume
- A random polynomial-time algorithm for approximating the volume of convex bodies
- APLicative Programming with Naperian Functors
- An array-oriented language with static rank polymorphism
- An assertion-based program logic for probabilistic programs
- Computing the volume is difficult
- Gaussian Cooling and $O^*(n^3)$ Algorithms for Volume and Gaussian Volume
- Linear-invariant generation for probabilistic programs: automated support for proof-based methods
- On the Complexity of Computing the Volume of a Polyhedron
- Probabilistic abstract interpretation
- Proving the Correctness of Multiprocess Programs
- Reluplex: an efficient SMT solver for verifying deep neural networks
- Simple relational correctness proofs for static analyses and program transformations
- Synthesizing Probabilistic Invariants via Doob’s Decomposition
Cited in (9)
- CLIP: cheap Lipschitz training of neural networks
- ABBA neural networks: coping with positivity, expressivity, and robustness
- Regularisation of neural networks by enforcing Lipschitz continuity
- Probabilistic robustness estimates for feed-forward neural networks
- CLEVEREST: accelerating CEGAR-based neural network verification via adversarial attacks
- Lipschitz Certificates for Layered Network Structures Driven by Averaged Activation Operators
- On Lipschitz Bounds of General Convolutional Neural Networks
- Robustness analysis of continuous-depth models with Lagrangian techniques
- LinSyn: synthesizing tight linear bounds for arbitrary neural network activation functions
MaRDI item: Q2233541