Lipschitz properties for deep convolutional networks
From MaRDI portal
Publication:4686249
Abstract: In this paper we discuss the stability properties of convolutional neural networks. Convolutional neural networks are widely used in machine learning; in classification they serve mainly as feature extractors. Ideally, we expect similar features when the inputs belong to the same class, that is, a small change in the feature vector in response to a deformation of the input signal. This can be established mathematically, and the key step is to derive the Lipschitz properties. Furthermore, we show that the stability results extend to more general networks. We give a formula for computing the Lipschitz bound and compare it with other methods to show that it is closer to the optimal value.
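The paper's own Lipschitz formula is not reproduced on this page. For context, a common baseline that such bounds are compared against is the naive product of the layers' operator norms, valid for networks with 1-Lipschitz activations (e.g. ReLU). A minimal sketch of that baseline, with hypothetical layer matrices:

```python
import numpy as np

def naive_lipschitz_bound(weights):
    """Upper-bound the Lipschitz constant of a feed-forward network
    with 1-Lipschitz activations by the product of the spectral norms
    (largest singular values) of its layer weight matrices."""
    bound = 1.0
    for W in weights:
        # np.linalg.norm with ord=2 returns the operator 2-norm
        bound *= np.linalg.norm(W, ord=2)
    return bound

# Hypothetical two-layer example with random weights
rng = np.random.default_rng(0)
layers = [rng.standard_normal((8, 16)), rng.standard_normal((4, 8))]
print(naive_lipschitz_bound(layers))
```

This product bound is generally loose, which is why sharper bounds of the kind studied in the paper are of interest.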
Recommendations
- CLIP: cheap Lipschitz training of neural networks
- Group invariance, stability to deformations, and complexity of deep convolutional representations
- Universality of deep convolutional neural networks
- Approximation properties of deep ReLU CNNs
- Stability of Deep Neural Networks via Discrete Rough Paths
Cited in (18)
- Lipschitz Certificates for Layered Network Structures Driven by Averaged Activation Operators
- Gabor neural networks with proven approximation properties
- Probabilistic Lipschitz analysis of neural networks
- Stability for the training of deep neural networks and other classifiers
- Designing stable neural networks using convex analysis and ODEs
- Stability of iterated dyadic filter banks
- A noise-based stabilizer for convolutional neural networks
- CLIP: cheap Lipschitz training of neural networks
- Asymptotic analysis of higher-order scattering transform of Gaussian processes
- Regularisation of neural networks by enforcing Lipschitz continuity
- Stochastic Markov gradient descent and training low-bit neural networks
- A Note on the Regularity of Images Generated by Convolutional Neural Networks
- On Lipschitz Bounds of General Convolutional Neural Networks
- Stability of the scattering transform for deformations with minimal regularity
- Multilinear compressive sensing and an application to convolutional linear networks
- On the convergence of formally diverging neural net-based classifiers
- Group invariance, stability to deformations, and complexity of deep convolutional representations
- Rendition: reclaiming what a black box takes away