Smooth sparse coding via marginal regression for learning sparse representations
Abstract: We propose and analyze a novel framework for learning sparse representations, based on two statistical techniques: kernel smoothing and marginal regression. The proposed approach provides a flexible framework for incorporating feature similarity or temporal information present in data sets via non-parametric kernel smoothing. We provide generalization bounds for dictionary learning using smooth sparse coding and show how the sample complexity depends on the \(L_1\) norm of the kernel function used. Furthermore, we propose using marginal regression for obtaining sparse codes, which significantly improves speed and allows one to scale easily to large dictionary sizes. We demonstrate the advantages of the proposed approach, in terms of both accuracy and speed, through extensive experimentation on several real data sets. In addition, we demonstrate how the proposed approach can be used to improve semi-supervised sparse coding.
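The abstract's core idea can be illustrated with a minimal sketch: marginal regression replaces an \(L_1\)-penalized solve per signal with a single pass of correlations against the dictionary, and a non-parametric kernel smooths signals across neighboring samples (here, a Gaussian kernel over the sample index as a stand-in for temporal smoothing). All function and parameter names below are illustrative, not from the paper's implementation.

```python
import numpy as np

def smooth_sparse_codes(X, D, k, bandwidth=1.0):
    """Hedged sketch of smooth sparse coding via marginal regression.

    X : (n, d) data matrix, one signal per row
    D : (d, p) dictionary with unit-norm columns
    k : number of nonzero coefficients kept per signal
    bandwidth : Gaussian kernel bandwidth over the sample index
                (illustrative choice of smoothing kernel)
    """
    n = X.shape[0]

    # Gaussian kernel weights over sample indices; rows normalized to sum to one.
    idx = np.arange(n)
    K = np.exp(-0.5 * ((idx[:, None] - idx[None, :]) / bandwidth) ** 2)
    K /= K.sum(axis=1, keepdims=True)

    # Kernel-smooth the signals, then take marginal regression coefficients:
    # one matrix product D^T x per signal instead of an L1-penalized solve.
    X_smooth = K @ X            # (n, d) smoothed signals
    C = X_smooth @ D            # (n, p) marginal coefficients

    # Hard-threshold: keep only the k largest-magnitude coefficients per row.
    codes = np.zeros_like(C)
    top = np.argsort(-np.abs(C), axis=1)[:, :k]
    rows = np.arange(n)[:, None]
    codes[rows, top] = C[rows, top]
    return codes
```

The speed advantage claimed in the abstract comes from the `X_smooth @ D` step: obtaining codes costs one matrix multiplication plus a top-\(k\) selection, so it scales straightforwardly to large dictionary sizes.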
Recommendations
- Online learning for matrix factorization and sparse coding
- A max-margin dictionary learning algorithm for sparse representation
- Efficient dictionary learning with sparseness-enforcing projections
- Combining Reconstruction and Discrimination with Class-Specific Sparse Coding
- Proximal alternating method for dictionary learning
Cites work
- scientific article, zbMATH DE number 847282 (no title available)
- scientific article, zbMATH DE number 3310599 (no title available)
- A comparison of the Lasso and marginal regression
- Combinatorial methods in density estimation
- Convergence analysis of perturbed feasible descent methods
- Greed is Good: Algorithmic Results for Sparse Approximation
- Learning with Structured Sparsity
- Local Rademacher complexities
- Local Regression and Likelihood
- On the Goldstein-Levitin-Polyak gradient projection method
- Smoothing \(\ell_1\)-penalized estimators for high-dimensional time-course data
- Sparsity and Smoothness Via the Fused Lasso
- Sure independence screening for ultrahigh dimensional feature space. With discussion and authors' reply
- The sample complexity of dictionary learning
Cited in (5)
- Hierarchical regularization networks for sparsification based learning on noisy datasets
- Provably accurate double-sparse coding
- Feature selection and multi-kernel learning for sparse representation on a manifold
- Robust sparse coding via self-paced learning for data representation
- Learning sparsely used overcomplete dictionaries via alternating minimization