Lasso for sparse linear regression with exponentially \(\beta\)-mixing errors
DOI: 10.1016/j.spl.2017.01.023 · zbMATH Open: 1457.62216 · OpenAlex: W2585836085 · MaRDI QID: Q2407765
Authors: Fang Xie, Lihu Xu, Youcai Yang
Publication date: 6 October 2017
Published in: Statistics \& Probability Letters
Full work available at URL: https://doi.org/10.1016/j.spl.2017.01.023
Classification (MSC)
- Linear regression; mixed models (62J05)
- Asymptotic distribution theory in statistics (62E20)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Stationary stochastic processes (60G10)
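As a quick illustration of the paper's setting — Lasso estimation of a sparse linear model whose errors are dependent rather than i.i.d. (a stationary AR(1) process is a standard example of an exponentially \(\beta\)-mixing sequence) — here is a minimal NumPy sketch using cyclic coordinate descent. All dimensions, coefficient values, and the penalty level are illustrative choices, not taken from the paper:

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator, the proximal map of the l1 penalty."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Lasso via cyclic coordinate descent on
    (1/(2n)) * ||y - X b||^2 + lam * ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n  # per-coordinate curvature
    r = y - X @ b                      # running residual
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * b[j]        # partial residual without coordinate j
            rho = X[:, j] @ r / n
            b[j] = soft_threshold(rho, lam) / col_sq[j]
            r -= X[:, j] * b[j]
    return b

rng = np.random.default_rng(0)
n, p, s = 200, 50, 3
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:s] = [3.0, -2.0, 1.5]            # sparse true coefficient vector

# AR(1) errors: a stationary, exponentially beta-mixing noise process
eps = np.zeros(n)
for t in range(1, n):
    eps[t] = 0.5 * eps[t - 1] + rng.standard_normal()

y = X @ beta + eps
b_hat = lasso_cd(X, y, lam=0.2)
support = np.flatnonzero(np.abs(b_hat) > 1e-6)
```

In this regime the estimated support typically contains the three true coordinates; the oracle-inequality results of the paper quantify how such estimation and selection guarantees survive when the independence assumption on the errors is weakened to exponential \(\beta\)-mixing.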
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- Least angle regression. (With discussion)
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Title not available
- Lasso-type recovery of sparse representations for high-dimensional data
- On the conditions used to prove oracle results for the Lasso
- Simultaneous analysis of Lasso and Dantzig selector
- Square-root lasso: pivotal recovery of sparse signals via conic programming
- A significance test for the lasso
- Regression coefficient and autoregressive order shrinkage and selection via the lasso
- Convergence rates in the strong law for bounded mixing sequences
- Model selection for vector autoregressive processes via adaptive lasso
Cited In (4)
- Lasso guarantees for \(\beta \)-mixing heavy-tailed time series
- Oracle inequality for sparse trace regression models with exponential \(\beta\)-mixing errors
- Lasso regression in sparse linear model with \(\varphi\)-mixing errors
- Square-root Lasso for high-dimensional sparse linear systems with weakly dependent errors