Recovery of simultaneous low rank and two-way sparse coefficient matrices, a nonconvex approach
From MaRDI portal
Publication: 2286374
DOI: 10.1214/19-EJS1658
zbMath: 1434.90161
arXiv: 1802.06967
OpenAlex: W3001343415
MaRDI QID: Q2286374
Publication date: 22 January 2020
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/1802.06967
Keywords: nonconvex optimization; multi-task learning; gradient descent with hard thresholding; low rank and two-way sparse coefficient matrix; two-way sparse reduced rank regression
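The keywords above name gradient descent with hard thresholding. As an illustration only (not the paper's exact algorithm, which operates on low-rank and two-way sparse coefficient matrices), here is a minimal sketch of iterative hard thresholding for an ordinary s-sparse least-squares problem; all function names and parameters are chosen for this example:

```python
import numpy as np

def hard_threshold(v, s):
    """Keep the s largest-magnitude entries of v, zero out the rest."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-s:]
    out[idx] = v[idx]
    return out

def iht(X, y, s, step=None, n_iter=200):
    """Iterative hard thresholding for s-sparse least squares.

    Runs projected gradient descent on ||y - X b||^2 subject to
    ||b||_0 <= s; the step size defaults to 1 / ||X||_2^2.
    """
    n, p = X.shape
    if step is None:
        step = 1.0 / (np.linalg.norm(X, 2) ** 2)
    b = np.zeros(p)
    for _ in range(n_iter):
        grad = X.T @ (X @ b - y)          # gradient of the squared loss
        b = hard_threshold(b - step * grad, s)  # gradient step + projection
    return b

# Toy example: recover a 3-sparse coefficient vector from noiseless data.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 20))
b_true = np.zeros(20)
b_true[[2, 7, 11]] = [1.5, -2.0, 1.0]
y = X @ b_true
b_hat = iht(X, y, s=3)
print(np.nonzero(b_hat)[0])  # support of the estimate
```

The hard-thresholding projection is what makes the constraint set (and hence the problem) nonconvex; the cited paper extends this style of analysis to matrices that are simultaneously low rank and row/column sparse.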
Related Items
- An optimal statistical and computational framework for generalized tensor estimation
- Improved Estimation of High-dimensional Additive Models Using Subspace Learning
- A tight bound of modified iterative hard thresholding algorithm for compressed sensing
- Exponential-Family Embedding With Application to Cell Developmental Trajectories for Single-Cell RNA-Seq Data
- ISLET: Fast and Optimal Low-Rank Tensor Regression via Importance Sketching
Uses Software
Cites Work
- Matrix Completion and Low-Rank SVD via Fast Alternating Least Squares
- Sparse principal component analysis and iterative thresholding
- Minimax bounds for sparse PCA with noisy high-dimensional data
- Optimal detection of sparse principal components in high dimension
- Noisy matrix decomposition via convex relaxation: optimal rates in high dimensions
- Estimation of high-dimensional low-rank matrices
- Estimation of (near) low-rank matrices with noise and high-dimensional scaling
- Optimal selection of reduced rank estimators of high-dimensional matrices
- Oracle inequalities and optimal inference under group sparsity
- Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion
- Joint variable and rank selection for parsimonious estimation of high-dimensional matrices
- High-dimensional analysis of semidefinite relaxations for sparse principal components
- The benefit of group sparsity
- Convex multi-task feature learning
- Moderate projection pursuit regression for multivariate response data
- A nonlinear programming algorithm for solving semidefinite programs via low-rank factorization
- Introductory lectures on convex optimization. A basic course.
- The landscape of empirical risk for nonconvex losses
- Low-rank matrix recovery with composite optimization: good conditioning and rapid convergence
- ROP: matrix recovery via rank-one projections
- Recovering block-structured activations using compressive measurements
- Parallel stochastic gradient algorithms for large-scale matrix completion
- Support union recovery in high-dimensional multivariate regression
- Minimax sparse principal subspace estimation in high dimensions
- Sparse PCA: optimal rates and adaptive estimation
- Local minima and convergence in low-rank semidefinite programming
- Exact matrix completion via convex optimization
- Guaranteed Matrix Completion via Non-Convex Factorization
- Incoherence-Optimal Matrix Completion
- Robust principal component analysis?
- A Singular Value Thresholding Algorithm for Matrix Completion
- Biclustering via Sparse Singular Value Decomposition
- Rank-Sparsity Incoherence for Matrix Decomposition
- On Cross-Validation for Sparse Reduced Rank Regression
- Modern Multivariate Statistical Techniques
- The learnability of quantum states
- Interior-Point Method for Nuclear Norm Approximation with Application to System Identification
- Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization
- Global Optimality in Low-Rank Matrix Optimization
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- Sparse Reduced-Rank Regression for Simultaneous Dimension Reduction and Variable Selection
- Adaptive Estimation in Two-way Sparse Reduced-rank Regression
- Between hard and soft thresholding: optimal iterative thresholding algorithms
- Estimating differential latent variable graphical models with applications to brain connectivity
- Dimension Reduction and Coefficient Estimation in Multivariate Linear Regression
- An Equivalence between Critical Points for Rank Constraints Versus Low-Rank Factorizations
- SOFAR: Large-Scale Association Network Learning
- Finding Low-Rank Solutions via Nonconvex Matrix Factorization, Efficiently and Provably
- Gradient descent with non-convex constraints: local concavity determines convergence
- Orthogonal Rank-One Matrix Pursuit for Low Rank Matrix Completion
- On Consistency and Sparsity for Principal Components Analysis in High Dimensions
- Robust Matrix Decomposition With Sparse Corruptions
- Recovering Low-Rank Matrices From Few Coefficients in Any Basis
- ADMiRA: Atomic Decomposition for Minimum Rank Approximation
- Matrix Completion From a Few Entries
- The Power of Convex Relaxation: Near-Optimal Matrix Completion
- Selective factor extraction in high dimensions
- Singular vectors under random perturbation
- A Simpler Approach to Matrix Completion
- Restricted strong convexity and weighted matrix completion: Optimal bounds with noise
- Model Selection and Estimation in Regression with Grouped Variables
- Low-rank matrix completion using alternating minimization
- Reduced Rank Stochastic Regression with a Sparse Singular Value Decomposition
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers