Online optimization for max-norm regularization
Publication: 2014582
DOI: 10.1007/s10994-017-5628-6
zbMATH Open: 1454.68132
arXiv: 1406.3190
OpenAlex: W2153439876
MaRDI QID: Q2014582
FDO: Q2014582
Publication date: 25 August 2017
Published in: Machine Learning
Abstract: The max-norm regularizer has been extensively studied in the last decade, as it promotes effective low-rank estimation of the underlying data. However, max-norm regularized problems are typically formulated and solved in a batch manner, which prevents them from processing big data due to memory constraints. In this paper we therefore propose an online algorithm that scales to the large-scale setting. In particular, we consider the matrix decomposition problem as an example, although a simple variant of the algorithm and analysis can be adapted to other important problems such as matrix completion. The crucial technique in our implementation is to reformulate the max-norm into an equivalent matrix factorization form, where the factors consist of a (possibly overcomplete) basis component and a coefficient component. In this way, we can maintain the basis component in memory and alternately optimize over it and the coefficients for each sample. Since the memory footprint of the basis component is independent of the sample size, our algorithm is appealing when manipulating a large collection of samples. We prove that the sequence of solutions (i.e., the basis component) produced by our algorithm converges asymptotically to a stationary point of the expected loss function. A numerical study demonstrates encouraging results for the efficacy and robustness of our algorithm compared to the widely used nuclear norm solvers.
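The following is a minimal sketch of the online alternating scheme described in the abstract, assuming a squared reconstruction loss and a row-norm-bounded basis as a simple stand-in for the factored max-norm constraint (and omitting any sparse-corruption term the paper may use). All function names, step sizes, and constants are illustrative assumptions, not the authors' implementation: for each incoming sample the coefficient subproblem is solved exactly, and the basis kept in memory is then updated by a projected stochastic gradient step.

```python
import numpy as np

def project_row_norms(L, radius):
    """Rescale rows of L so no row has Euclidean norm larger than `radius`
    (a common surrogate for the max-norm constraint in factored form; assumption)."""
    norms = np.linalg.norm(L, axis=1, keepdims=True)
    return L * np.minimum(1.0, radius / np.maximum(norms, 1e-12))

def fit_coefficient(L, z, lam):
    """Per-sample coefficient step: argmin_r 0.5*||z - L r||^2 + 0.5*lam*||r||^2."""
    d = L.shape[1]
    return np.linalg.solve(L.T @ L + lam * np.eye(d), L.T @ z)

def online_max_norm_decomposition(samples, d=20, radius=1.0, lam=0.1, step=0.05, seed=0):
    """Stream samples one by one; only the p-by-d basis L is kept in memory."""
    rng = np.random.default_rng(seed)
    p = len(samples[0])
    L = project_row_norms(rng.standard_normal((p, d)) / np.sqrt(d), radius)
    for t, z in enumerate(samples, start=1):
        r = fit_coefficient(L, z, lam)                  # coefficient step for this sample
        grad = np.outer(L @ r - z, r)                   # gradient of the sample loss w.r.t. L
        L = project_row_norms(L - (step / np.sqrt(t)) * grad, radius)  # basis step
    return L

# Illustrative usage on synthetic low-rank data (assumed setup, not from the paper):
rng = np.random.default_rng(1)
U, V = rng.standard_normal((50, 5)), rng.standard_normal((5, 1000))
M = U @ V + 0.01 * rng.standard_normal((50, 1000))
basis = online_max_norm_decomposition([M[:, j] for j in range(M.shape[1])])
print("learned basis shape:", basis.shape)   # (50, 20)
```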
Full work available at URL: https://arxiv.org/abs/1406.3190
Mathematics Subject Classification
- Factor analysis and principal components; correspondence analysis (62H25)
- Learning and adaptive systems in artificial intelligence (68T05)
- Online algorithms; streaming algorithms (68W27)
- Factorization of matrices (15A23)
Cites Work
- Principal component analysis
- Asymptotic Statistics
- Robust principal component analysis?
- A Singular Value Thresholding Algorithm for Matrix Completion
- 1-Bit matrix completion
- Noisy low-rank matrix completion with general sampling distribution
- Exact matrix completion via convex optimization
- Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization
- A Max-Norm Constrained Minimization Approach to 1-Bit Matrix Completion
- De-noising by soft-thresholding
- Title not available
- Online learning for matrix factorization and sparse coding
- Robust PCA via Outlier Pursuit
- Local minima and convergence in low-rank semidefinite programming
- Matrix completion via max-norm constrained optimization
- Title not available
- Learning Theory
- Optimization Problems with Perturbations: A Guided Tour
- Clustering, Hamming Embedding, Generalized LSH and the Max Norm
- Outlier-Robust PCA: The High-Dimensional Case
Cited In (7)
- Title not available
- The relaxed online maximum margin algorithm
- Max-norm optimization for robust matrix recovery
- Online BP functions maximization
- Online manifold regularization by dual ascending procedure
- Online Schatten quasi-norm minimization for robust principal component analysis
- A Stochastic Majorize-Minimize Subspace Algorithm for Online Penalized Least Squares Estimation