An efficient superpixel-based sparse representation framework for hyperspectral image classification
DOI: 10.1142/S0219691317500618 · zbMATH Open: 1386.68206 · OpenAlex: W2754475250 · MaRDI QID: Q4595573 · FDO: Q4595573
Authors: Sen Jia, Bin Deng, Qiang Huang
Publication date: 5 December 2017
Published in: International Journal of Wavelets, Multiresolution and Information Processing
Full work available at URL: https://doi.org/10.1142/s0219691317500618
Recommendations
- Spatial correlation constrained weighted conditional sparse representation for hyperspectral image classification
- Sparse representation based binary hypothesis model for hyperspectral image classification
- Locality-constrained sparse representation for hyperspectral image classification
- Spatial distribution preserving-based sparse subspace clustering for hyperspectral image
- A collaborative representation-based classifier coupling with \(K\)-nearest-neighbor dictionary for hyperspectral image classification
Mathematics Subject Classification:
- Pattern recognition, speech recognition (68T10)
- Computing methodologies for image processing (68U10)
Cites Work
Cited In (7)
- A collaborative representation-based classifier coupling with \(K\)-nearest-neighbor dictionary for hyperspectral image classification
- Superpixel based recursive least-squares method for lossless compression of hyperspectral images
- Sparse representation based binary hypothesis model for hyperspectral image classification
- Spatial correlation constrained weighted conditional sparse representation for hyperspectral image classification
- Locality-constrained sparse representation for hyperspectral image classification
- A Deep Sparse Representation with Random Dictionary for Hyperspectral Image Classification
- On curvelet CS reconstructed MR images and GA-based fuzzy conditional entropy maximization for segmentation