Deep Neural Networks With Trainable Activations and Controlled Lipschitz Constant
Publication: 5103019
DOI: 10.1109/TSP.2020.3014611
OpenAlex: W3048660463
MaRDI QID: Q5103019
FDO: Q5103019
Authors: Shayan Aziznejad, Harshit Gupta, Joaquim Campos, Michael Unser
Publication date: 23 September 2022
Published in: IEEE Transactions on Signal Processing
Full work available at URL: https://arxiv.org/abs/2001.06263
Cited In (8)
- CLIP: cheap Lipschitz training of neural networks
- Sparsest piecewise-linear regression of one-dimensional data
- What Kinds of Functions Do Deep Neural Networks Learn? Insights from Variational Spline Theory
- Linear inverse problems with Hessian-Schatten total variation
- Approximation of Lipschitz Functions Using Deep Spline Neural Networks
- A survey on modern trainable activation functions
- Lipschitz Certificates for Layered Network Structures Driven by Averaged Activation Operators
- On Lipschitz Bounds of General Convolutional Neural Networks