Newton Sketch: A Near Linear-Time Optimization Algorithm with Linear-Quadratic Convergence


DOI: 10.1137/15M1021106 · zbMath: 1456.90125 · arXiv: 1505.02250 · OpenAlex: W2963060476 · MaRDI QID: Q2967608

Martin J. Wainwright, Mert Pilanci

Publication date: 1 March 2017

Published in: SIAM Journal on Optimization

Full work available at URL: https://arxiv.org/abs/1505.02250
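The paper's core idea is to replace the exact Hessian in Newton's method with a randomized sketch of its square root, so each step costs far less than a full Newton step while retaining fast convergence. As a hedged illustration only (not the authors' reference implementation; all variable names, sizes, and the choice of a plain Gaussian sketch are assumptions for this sketch), here is the idea applied to a least-squares objective f(x) = ½‖Ax − b‖², whose Hessian is AᵀA:

```python
import numpy as np

# Illustrative sketch of the Newton Sketch idea for least squares.
# The Hessian A^T A is approximated by (S A)^T (S A), where S is a
# random sketching matrix with m << n rows. Sizes below are chosen
# arbitrarily for demonstration.

rng = np.random.default_rng(0)
n, d, m = 2000, 50, 400            # samples, dimension, sketch size
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

x = np.zeros(d)
for _ in range(15):
    grad = A.T @ (A @ x - b)                       # exact gradient
    S = rng.standard_normal((m, n)) / np.sqrt(m)   # Gaussian sketch
    SA = S @ A                                     # sketched data (m x d)
    H_sketch = SA.T @ SA                           # sketched Hessian
    x = x - np.linalg.solve(H_sketch, grad)        # sketched Newton step

x_exact = np.linalg.lstsq(A, b, rcond=None)[0]     # exact least-squares solution
print(np.linalg.norm(x - x_exact))                 # small: iterates converge
```

Each iteration solves a d×d system built from the m×d sketched matrix rather than the full n×d data, which is where the per-step savings come from; the paper analyzes how large m must be for the resulting iterates to converge linearly, then quadratically, near the optimum.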



Related Items

Side-constrained minimum sum-of-squares clustering: mathematical programming and random projections
Functional principal subspace sampling for large scale functional data analysis
Randomized Spectral Clustering in Large-Scale Stochastic Block Models
A stochastic extra-step quasi-Newton method for nonsmooth nonconvex optimization
Randomized Newton's method for solving differential equations based on the neural network discretization
Sketch-based empirical natural gradient methods for deep learning
Sketched Newton--Raphson
Scalable subspace methods for derivative-free nonlinear least-squares optimization
Convergence analysis of a subsampled Levenberg-Marquardt algorithm
M-IHS: an accelerated randomized preconditioning method avoiding costly matrix decompositions
An overview of stochastic quasi-Newton methods for large-scale machine learning
Inexact restoration with subsampled trust-region methods for finite-sum minimization
On maximum residual nonlinear Kaczmarz-type algorithms for large nonlinear systems of equations
Generalized linear models for massive data via doubly-sketching
Randomized Quasi-Newton Updates Are Linearly Convergent Matrix Inversion Algorithms
Random projections for quadratic programs
Global optimization using random embeddings
Hessian averaging in stochastic Newton methods achieves superlinear convergence
Stochastic Reformulations of Linear Systems: Algorithms and Convergence Theory
Riemannian Natural Gradient Methods
Discriminative Bayesian filtering lends momentum to the stochastic Newton method for minimizing log-convex functions
An investigation of Newton-Sketch and subsampled Newton methods
Convergence of Newton-MR under Inexact Hessian Information
Reduced rank regression with matrix projections for high-dimensional multivariate linear regression model
Sub-sampled Newton methods
Optimization Methods for Large-Scale Machine Learning
A non-Euclidean gradient descent method with sketching for unconstrained matrix minimization
A Stochastic Semismooth Newton Method for Nonsmooth Nonconvex Optimization
On the local convergence of a stochastic semismooth Newton method for nonsmooth nonconvex optimization
Generalized self-concordant functions: a recipe for Newton-type methods
A hybrid stochastic optimization framework for composite nonconvex optimization

