Background subtraction with Kronecker-basis-representation based tensor sparsity and \(l_{1,1,2}\) norm
From MaRDI portal
Publication: 2658562
DOI: 10.1007/s11045-020-00729-w
zbMath: 1458.94022
OpenAlex: W3035050919
MaRDI QID: Q2658562
Lixia Chen, Xue-Wen Wang, Junli Liu
Publication date: 23 March 2021
Published in: Multidimensional Systems and Signal Processing
Full work available at URL: https://doi.org/10.1007/s11045-020-00729-w
Keywords: background subtraction; tensor robust principal component analysis; \(l_{1,1,2}\) norm; alternating direction multiplier method
Image processing (compression, reconstruction, etc.) in information and communication theory (94A08)
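One of the keywords above is the \(l_{1,1,2}\) norm. In the tensor-RPCA literature this norm is commonly defined, for a third-order tensor \(\mathcal{X}\), as the sum of the \(\ell_2\) norms of its mode-3 (tube) fibers; a minimal sketch under that assumption (the function name `l112_norm` is illustrative, not from the paper):

```python
import numpy as np

def l112_norm(X):
    """l_{1,1,2} norm of a 3-way tensor, assuming the common
    definition: sum over (i, j) of the l2 norm of the mode-3
    fiber X[i, j, :]."""
    return np.sqrt((X ** 2).sum(axis=2)).sum()

X = np.zeros((2, 2, 3))
X[0, 0, :] = [3.0, 4.0, 0.0]  # fiber with l2 norm 5
X[1, 1, :] = [0.0, 0.0, 2.0]  # fiber with l2 norm 2
print(l112_norm(X))  # 7.0
```

Summing \(\ell_2\) norms of tube fibers encourages entire fibers to be zero, which suits foreground masks that are sparse across pixels but temporally correlated along the third (frame) mode.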
Cites Work
- Decomposition into low-rank plus additive matrices for background/foreground separation: a review for a comparative evaluation with a large-scale dataset
- Enhancing sparsity by reweighted \(\ell _{1}\) minimization
- Global convergence of ADMM in nonconvex nonsmooth optimization
- Robust Low-Rank Tensor Recovery: Models and Algorithms
- Robust principal component analysis?
- Total Variation Regularized Tensor RPCA for Background Subtraction From Compressive Measurements
- Moving Object Detection in Complex Scene Using Spatiotemporal Structured-Sparse RPCA
- On the Complexity of Robust PCA and ℓ1-Norm Low-Rank Matrix Approximation
- Model Selection and Estimation in Regression with Grouped Variables
- The direct extension of ADMM for multi-block convex minimization problems is not necessarily convergent