An SMT-based approach for verifying binarized neural networks
From MaRDI portal
Publication:2233508
Abstract: Deep learning has emerged as an effective approach for creating modern software systems, with neural networks often surpassing hand-crafted systems. Unfortunately, neural networks are known to suffer from various safety and security issues. Formal verification is a promising avenue for tackling this difficulty, by formally certifying that networks are correct. We propose an SMT-based technique for verifying Binarized Neural Networks (BNNs), a popular kind of neural network in which some weights are binarized to make the network more memory- and energy-efficient and quicker to evaluate. One novelty of our technique is that it allows the verification of neural networks that include both binarized and non-binarized components. Neural network verification is computationally very difficult, and we therefore propose various optimizations, integrated into our SMT procedure as deduction steps, together with an approach for parallelizing verification queries. We implement our technique as an extension of the Marabou framework, and use it to evaluate the approach on popular binarized neural network architectures.
Recommendations
- Verifying binarized neural networks by Angluin-style learning
- \textsf{BDD4BNN}: a BDD-based quantitative analysis framework for binarized neural networks
- Reluplex: an efficient SMT solver for verifying deep neural networks
- Reluplex: a calculus for reasoning about deep neural networks
- Formal verification of piece-wise linear feed-forward neural networks
Cites work
- scientific article; zbMATH DE number 3177183 (no title available)
- scientific article; zbMATH DE number 6982943 (no title available)
- An abstraction-based framework for neural network verification
- Deep learning
- DeepSafe: a data-driven approach for assessing robustness of neural networks
- Formal verification of piece-wise linear feed-forward neural networks
- How many bits does it take to quantize your neural network?
- Reluplex: an efficient SMT solver for verifying deep neural networks
- Safety verification of deep neural networks
- Satisfiability modulo theories
- Verification of deep convolutional neural networks using ImageStars
Cited in (15)
- Verifying the generalization of deep learning to out-of-distribution domains
- \textsf{BDD4BNN}: a BDD-based quantitative analysis framework for binarized neural networks
- Reluplex: an efficient SMT solver for verifying deep neural networks
- Verifying Neural Networks Against Backdoor Attacks
- Verifying binarized neural networks by Angluin-style learning
- Formal verification for quantized neural networks
- On neural network equivalence checking using SMT solvers
- \textsf{QEBVerif}: quantization error bound verification of neural networks
- Verifying generalization in deep learning
- Task-aware verifiable RNN-based policies for partially observable Markov decision processes
- \textsf{CLEVEREST}: accelerating CEGAR-based neural network verification via adversarial attacks
- SMT-based modeling and verification of spiking neural networks: a case study
- Towards formal XAI: formally approximate minimal explanations of neural networks
- Challenging SMT solvers to verify neural networks
- \textsc{OccRob}: efficient SMT-based occlusion robustness verification of deep neural networks