High-dimensional model recovery from random sketched data by exploring intrinsic sparsity
Publication: 782446
DOI: 10.1007/s10994-019-05865-4
zbMath: 1472.68165
OpenAlex: W3000071498
Wikidata: Q126399604 (Scholia: Q126399604)
MaRDI QID: Q782446
Authors: Qihang Lin, Shenghuo Zhu, Rong Jin, Tianbao Yang, Li-jun Zhang
Publication date: 27 July 2020
Published in: Machine Learning
Full work available at URL: https://doi.org/10.1007/s10994-019-05865-4
MSC classification:
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- General nonlinear regression (62J02)
- Learning and adaptive systems in artificial intelligence (68T05)
Cites Work
- Finding structure with randomness: Probabilistic algorithms for constructing approximate matrix decompositions
- CUR matrix decompositions for improved data analysis
- Oracle inequalities in empirical risk minimization and sparse recovery problems. École d'Été de Probabilités de Saint-Flour XXXVIII-2008.
- Faster least squares approximation
- Learning to decode cognitive states from brain images
- "Preconditioning" for feature selection and regression in high-dimensional problems
- Database-friendly random projections: Johnson-Lindenstrauss with binary coins
- Preconditioning the Lasso for sign consistency
- Simultaneous analysis of Lasso and Dantzig selector
- Image classification with the Fisher vector: theory and practice
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\) (with discussions and rejoinder)
- Iterative Hessian sketch: Fast and accurate solution approximation for constrained least-squares
- One-Bit Compressed Sensing by Linear Programming
- A Proximal-Gradient Homotopy Method for the Sparse Least-Squares Problem
- Improved Matrix Algorithms via the Subsampled Randomized Hadamard Transform
- A sparse Johnson-Lindenstrauss transform
- Approximate nearest neighbors and the fast Johnson-Lindenstrauss transform
- Randomized Sketches of Convex Programs With Sharp Guarantees
- Random Projections for Classification: A Recovery Approach
- Improved Analysis of the Subsampled Randomized Hadamard Transform
- Randomized Algorithms for Matrices and Data
- Sparser Johnson-Lindenstrauss Transforms
- Extensions of Lipschitz mappings into a Hilbert space
- Sampling algorithms for \(\ell_2\) regression and applications
- Relative-Error CUR Matrix Decompositions
- An elementary proof of a theorem of Johnson and Lindenstrauss
- Information-Theoretic Limits on Sparsity Recovery in the High-Dimensional and Noisy Setting
- The Fast Johnson–Lindenstrauss Transform and Approximate Nearest Neighbors
- A Proximal Stochastic Gradient Method with Progressive Variance Reduction
- Regularization and Variable Selection Via the Elastic Net
- Stochastic Dual Coordinate Ascent Methods for Regularized Loss Minimization
- Algorithmic Learning Theory
- Gene selection for cancer classification using support vector machines
- Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization