Recovery of simultaneous low rank and two-way sparse coefficient matrices, a nonconvex approach
DOI: 10.1214/19-EJS1658
zbMATH Open: 1434.90161
arXiv: 1802.06967
OpenAlex: W3001343415
MaRDI QID: Q2286374
FDO: Q2286374
Authors: Ming Yu, Varun Gupta, Mladen Kolar
Publication date: 22 January 2020
Published in: Electronic Journal of Statistics
Full work available at URL: https://arxiv.org/abs/1802.06967
Recommendations
- Robust recovery of low-rank matrices with non-orthogonal sparse decomposition from incomplete measurements
- Riemannian thresholding methods for row-sparse and low-rank matrix recovery
- Convex relaxation algorithm for a structured simultaneous low-rank and sparse recovery problem
- Low-rank and sparse matrix recovery via inexact Newton-like method with non-monotone search
- Simultaneous pursuit of sparseness and rank structures for matrix decomposition
Keywords: multi-task learning; nonconvex optimization; gradient descent with hard thresholding; low rank and two-way sparse coefficient matrix; two-way sparse reduced-rank regression
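The keyword "gradient descent with hard thresholding" names the algorithmic template behind the paper. The sketch below is a minimal, simplified illustration of that template for two-way sparse reduced-rank regression (fit Y ≈ X U Vᵀ with row-sparse factors), not the paper's exact algorithm: the function names, initialization, and step-size choice are assumptions made for this example.

```python
import numpy as np

def hard_threshold_rows(M, s):
    """Keep the s rows of M with largest Euclidean norm; zero out the rest."""
    norms = np.linalg.norm(M, axis=1)
    keep = np.argsort(norms)[-s:]
    out = np.zeros_like(M)
    out[keep] = M[keep]
    return out

def gdt(X, Y, r, s1, s2, step=None, iters=500):
    """Factored gradient descent with row-wise hard thresholding for
    two-way sparse reduced-rank regression: fit Y ~ X @ U @ V.T, where
    U (p x r) has at most s1 nonzero rows and V (q x r) at most s2.
    Simplified illustrative variant, not the paper's exact method."""
    n = X.shape[0]
    # Initialize from the SVD of the cross-covariance surrogate X^T Y / n.
    C = X.T @ Y / n
    Uf, S, Vt = np.linalg.svd(C, full_matrices=False)
    U = Uf[:, :r] * np.sqrt(S[:r])
    V = Vt[:r].T * np.sqrt(S[:r])
    if step is None:
        step = 0.5 * n / np.linalg.norm(X, 2) ** 2  # conservative step size
    for _ in range(iters):
        R = X @ U @ V.T - Y            # residual
        gU = X.T @ R @ V / n           # gradient w.r.t. U
        gV = R.T @ X @ U / n           # gradient w.r.t. V
        # gradient step on each factor, then project back to row sparsity
        U = hard_threshold_rows(U - step * gU, s1)
        V = hard_threshold_rows(V - step * gV, s2)
    return U, V
```

By construction the iterates stay in the constraint set (at most s1 and s2 nonzero rows), which is the defining feature of hard-thresholded gradient descent as opposed to convex relaxation approaches.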
Cites Work
- A nonlinear programming algorithm for solving semidefinite programs via low-rank factorization
- Matrix completion from noisy entries
- Matrix completion and low-rank SVD via fast alternating least squares
- SOFAR: Large-Scale Association Network Learning
- ADMiRA: Atomic Decomposition for Minimum Rank Approximation
- Title not available
- Sure Independence Screening for Ultrahigh Dimensional Feature Space
- Model Selection and Estimation in Regression with Grouped Variables
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- Introductory lectures on convex optimization. A basic course.
- Robust principal component analysis?
- A Singular Value Thresholding Algorithm for Matrix Completion
- Modern Multivariate Statistical Techniques
- On consistency and sparsity for principal components analysis in high dimensions
- Truncated power method for sparse eigenvalue problems
- Exact matrix completion via convex optimization
- Sparse reduced-rank regression for simultaneous dimension reduction and variable selection
- Dimension Reduction and Coefficient Estimation in Multivariate Linear Regression
- Estimation of (near) low-rank matrices with noise and high-dimensional scaling
- Optimal selection of reduced rank estimators of high-dimensional matrices
- Reduced Rank Stochastic Regression with a Sparse Singular Value Decomposition
- Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion
- Joint variable and rank selection for parsimonious estimation of high-dimensional matrices
- Parallel stochastic gradient algorithms for large-scale matrix completion
- Sparse principal component analysis and iterative thresholding
- Guaranteed minimum-rank solutions of linear matrix equations via nuclear norm minimization
- The Power of Convex Relaxation: Near-Optimal Matrix Completion
- Restricted strong convexity and weighted matrix completion: optimal bounds with noise
- Estimation of high-dimensional low-rank matrices
- Sparse PCA: optimal rates and adaptive estimation
- Rank-Sparsity Incoherence for Matrix Decomposition
- Optimal detection of sparse principal components in high dimension
- Simultaneous inference for pairwise graphical models with generalized score matching
- Robust Matrix Decomposition With Sparse Corruptions
- Oracle inequalities and optimal inference under group sparsity
- The benefit of group sparsity
- Matrix Completion From a Few Entries
- Noisy matrix decomposition via convex relaxation: optimal rates in high dimensions
- Local minima and convergence in low-rank semidefinite programming
- ROP: matrix recovery via rank-one projections
- Title not available
- Interior-point method for nuclear norm approximation with application to system identification
- Recovering Low-Rank Matrices From Few Coefficients in Any Basis
- A simpler approach to matrix completion
- Low-rank matrix completion using alternating minimization
- Reinforcement learning. An introduction
- Convex multi-task feature learning
- Multi-task learning for classification with Dirichlet process priors
- Support union recovery in high-dimensional multivariate regression
- Finite-time bounds for fitted value iteration
- Minimax sparse principal subspace estimation in high dimensions
- Title not available
- Minimax bounds for sparse PCA with noisy high-dimensional data
- Biclustering via sparse singular value decomposition
- Incoherence-Optimal Matrix Completion
- High-dimensional analysis of semidefinite relaxations for sparse principal components
- Low-rank matrix recovery with composite optimization: good conditioning and rapid convergence
- Union support recovery in multi-task learning
- Guaranteed Matrix Completion via Non-Convex Factorization
- Finding low-rank solutions via nonconvex matrix factorization, efficiently and provably
- Singular vectors under random perturbation
- Between hard and soft thresholding: optimal iterative thresholding algorithms
- Moderate projection pursuit regression for multivariate response data
- Natural language processing (almost) from scratch
- Adaptive estimation in two-way sparse reduced-rank regression
- Orthogonal rank-one matrix pursuit for low rank matrix completion
- The learnability of quantum states
- Estimating differential latent variable graphical models with applications to brain connectivity
- Selective factor extraction in high dimensions
- The landscape of empirical risk for nonconvex losses
- Global Optimality in Low-Rank Matrix Optimization
- Gradient descent with non-convex constraints: local concavity determines convergence
- Scalable interpretable multi-response regression via SEED
- Recovering block-structured activations using compressive measurements
- On Cross-Validation for Sparse Reduced Rank Regression
- An equivalence between critical points for rank constraints versus low-rank factorizations
- Simultaneous pursuit of sparseness and rank structures for matrix decomposition
Cited In (13)
- Title not available
- ISLET: fast and optimal low-rank tensor regression via importance sketching
- A Fast Majorize–Minimize Algorithm for the Recovery of Sparse and Low-Rank Matrices
- An optimal statistical and computational framework for generalized tensor estimation
- Exponential-Family Embedding With Application to Cell Developmental Trajectories for Single-Cell RNA-Seq Data
- A tight bound of modified iterative hard thresholding algorithm for compressed sensing.
- Estimation of a low-rank topic-based model for information cascades
- Simultaneous pursuit of sparseness and rank structures for matrix decomposition
- Greedy low-rank algorithm for spatial connectome regression
- Riemannian thresholding methods for row-sparse and low-rank matrix recovery
- Low-rank and sparse matrix recovery via inexact Newton-like method with non-monotone search
- Improved Estimation of High-dimensional Additive Models Using Subspace Learning
- Near-optimal estimation of simultaneously sparse and low-rank matrices from nested linear measurements