Efficient Global Optimization of Two-Layer ReLU Networks: Quadratic-Time Algorithms and Adversarial Training
Publication: 6171686
DOI: 10.1137/21M1467134
arXiv: 2201.01965
OpenAlex: W4379232471
MaRDI QID: Q6171686
Somayeh Sojoudi, Unnamed Author, Unnamed Author
Publication date: 14 August 2023
Published in: SIAM Journal on Mathematics of Data Science
Full work available at URL: https://arxiv.org/abs/2201.01965
MSC classifications:
- Analysis of algorithms and problem complexity (68Q25)
- Numerical methods involving duality (49M29)
- Neural nets applied to problems in time-dependent statistical mechanics (82C32)
- Applications of functional analysis in optimization, convex analysis, mathematical programming, economics (46N10)
- Neural nets and related approaches to inference from stochastic processes (62M45)
Related Items (1)
Cites Work
- Unnamed Item
- Distributed Optimization and Statistical Learning via the Alternating Direction Method of Multipliers
- On the complexity analysis of randomized block-coordinate descent methods
- Approximate ADMM algorithms derived from Lagrangian splitting
- Multiplier and gradient methods
- CVXPY: A Python-Embedded Modeling Language for Convex Optimization
- Learning representations by back-propagating errors
- Breaking the Curse of Dimensionality with Convex Neural Networks