Deep Neural Networks With Trainable Activations and Controlled Lipschitz Constant

From MaRDI portal
Publication:5103019

DOI: 10.1109/TSP.2020.3014611
OpenAlex: W3048660463
MaRDI QID: Q5103019
FDO: Q5103019


Authors: Shayan Aziznejad, Harshit Gupta, Joaquim Campos, M. Unser


Publication date: 23 September 2022

Published in: IEEE Transactions on Signal Processing

Full work available at URL: https://arxiv.org/abs/2001.06263





Mathematics Subject Classification

Signal theory (characterization, reconstruction, filtering, etc.) (94A12)



Cited In (8)

  • CLIP: cheap Lipschitz training of neural networks
  • Sparsest piecewise-linear regression of one-dimensional data
  • What Kinds of Functions Do Deep Neural Networks Learn? Insights from Variational Spline Theory
  • Linear inverse problems with Hessian-Schatten total variation
  • Approximation of Lipschitz Functions Using Deep Spline Neural Networks
  • A survey on modern trainable activation functions
  • Lipschitz Certificates for Layered Network Structures Driven by Averaged Activation Operators
  • On Lipschitz Bounds of General Convolutional Neural Networks







Retrieved from "https://portal.mardi4nfdi.de/w/index.php?title=Publication:5103019&oldid=19622329"
This page was last edited on 8 February 2024, at 13:19.