Probabilistic Lipschitz analysis of neural networks
DOI: 10.1007/978-3-030-65474-0_13
zbMATH Open: 1474.68155
OpenAlex: W3119781287
MaRDI QID: Q2233541
FDO: Q2233541
Authors: Ravi Mangal, Kartik Sarangmath, Aditya V. Nori, Alessandro Orso
Publication date: 18 October 2021
Full work available at URL: https://doi.org/10.1007/978-3-030-65474-0_13
Mathematics Subject Classification
- Theory of programming languages (68N15)
- Mathematical aspects of software engineering (specification, verification, metrics, requirements, etc.) (68N30)
- Semantics in the theory of computing (68Q55)
- Networks and circuits as models of computation; circuit complexity (68Q06)
Cites Work
- A random polynomial-time algorithm for approximating the volume of convex bodies
- A geometric inequality and the complexity of computing volume
- On the Complexity of Computing the Volume of a Polyhedron
- An assertion-based program logic for probabilistic programs
- Proving the Correctness of Multiprocess Programs
- Computing the volume is difficult
- Simple relational correctness proofs for static analyses and program transformations
- Linear-invariant generation for probabilistic programs: automated support for proof-based methods
- Reluplex: an efficient SMT solver for verifying deep neural networks
- Synthesizing Probabilistic Invariants via Doob's Decomposition
- Probabilistic abstract interpretation
- APLicative Programming with Naperian Functors
- Gaussian Cooling and $O^*(n^3)$ Algorithms for Volume and Gaussian Volume
- An array-oriented language with static rank polymorphism
Cited In (8)
- CLIP: cheap Lipschitz training of neural networks
- ABBA neural networks: coping with positivity, expressivity, and robustness
- Probabilistic robustness estimates for feed-forward neural networks
- CLEVEREST: accelerating CEGAR-based neural network verification via adversarial attacks
- Lipschitz Certificates for Layered Network Structures Driven by Averaged Activation Operators
- On Lipschitz Bounds of General Convolutional Neural Networks
- Robustness analysis of continuous-depth models with Lagrangian techniques
- LinSyn: synthesizing tight linear bounds for arbitrary neural network activation functions