AutoHOOT: Automatic High-Order Optimization for Tensors
Publication: Q6340372
arXiv: 2005.04540
MaRDI QID: Q6340372
FDO: Q6340372
Jiayu Ye, Linjian Ma, Edgar Solomonik
Publication date: 9 May 2020
Abstract: High-order optimization methods, including Newton's method and its variants as well as alternating minimization methods, dominate the optimization algorithms for tensor decompositions and tensor networks. These tensor methods are used for data analysis and for simulation of quantum systems. In this work, we introduce AutoHOOT, the first automatic differentiation (AD) framework targeting high-order optimization for tensor computations. AutoHOOT takes input tensor computation expressions and generates optimized derivative expressions. In particular, AutoHOOT contains a new explicit Jacobian/Hessian expression generation kernel whose outputs maintain the input tensors' granularity and are easy to optimize. The expressions are then optimized by both traditional compiler optimization techniques and specific tensor algebra transformations. Experimental results show that AutoHOOT achieves competitive CPU and GPU performance for both tensor decomposition and tensor network applications compared to existing AD software and other tensor computation libraries with manually written kernels. The tensor methods generated by AutoHOOT also parallelize well, and we demonstrate good scalability on a distributed-memory supercomputer.
Has companion code repository: https://github.com/LinjianMa/AutoHOOT
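To make the abstract's setting concrete, the sketch below shows, in plain NumPy, the kind of alternating-minimization tensor computation the paper targets: one alternating least squares (ALS) sweep for a rank-R CP decomposition of a 3-way tensor, where each factor update requires the matricized-tensor-times-Khatri-Rao-product (MTTKRP) that AD frameworks like AutoHOOT derive and optimize automatically. This is an illustration of the problem class only, not AutoHOOT's API; the function and variable names here are hypothetical.

```python
# Illustrative sketch only (not AutoHOOT's API): one ALS sweep of a
# rank-R CP decomposition T ~ sum_r A[:,r] o B[:,r] o C[:,r].
import numpy as np

def cp_als_sweep(T, A, B, C):
    """Update factors A, B, C by solving each mode's least squares problem."""
    # Mode-1 update: A = MTTKRP(T; B, C) @ pinv((C^T C) * (B^T B)),
    # where * is the Hadamard (elementwise) product.
    G = (C.T @ C) * (B.T @ B)                 # R x R Gram matrix
    M = np.einsum('ijk,jr,kr->ir', T, B, C)   # MTTKRP along mode 1
    A = M @ np.linalg.pinv(G)
    # Mode-2 and mode-3 updates are analogous.
    G = (C.T @ C) * (A.T @ A)
    M = np.einsum('ijk,ir,kr->jr', T, A, C)
    B = M @ np.linalg.pinv(G)
    G = (B.T @ B) * (A.T @ A)
    M = np.einsum('ijk,ir,jr->kr', T, A, B)
    C = M @ np.linalg.pinv(G)
    return A, B, C

# Usage: decompose a random 8x9x10 tensor at rank 4.
rng = np.random.default_rng(0)
T = rng.standard_normal((8, 9, 10))
A, B, C = (rng.standard_normal((n, 4)) for n in T.shape)
for _ in range(20):
    A, B, C = cp_als_sweep(T, A, B, C)
approx = np.einsum('ir,jr,kr->ijk', A, B, C)
print("relative error:", np.linalg.norm(T - approx) / np.linalg.norm(T))
```

In the paper's framework, derivative expressions such as the MTTKRP above are not hand-written as here but are generated from the input tensor expression by the Jacobian/Hessian kernel and then simplified by compiler and tensor algebra transformations.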