Factor-Adjusted Regularized Model Selection
MSC classification:
- Computational methods for problems pertaining to statistics (62-08)
- Factor analysis and principal components; correspondence analysis (62H25)
- Linear regression; mixed models (62J05)
- Time series, auto-correlation, regression, etc. in statistics (GARCH) (62M10)
- Applications of statistics to economics (62P20)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
Abstract: This paper studies model selection consistency for high-dimensional sparse regression when the data exhibit both cross-sectional and serial dependence. Most commonly used model selection methods fail to consistently recover the true model when the covariates are highly correlated. Motivated by econometric studies, we consider the case where covariate dependence can be reduced through a factor model, and propose a consistent strategy named Factor-Adjusted Regularized Model Selection (FarmSelect). By separating the latent factors from the idiosyncratic components, we transform the problem from model selection with highly correlated covariates to one with weakly correlated covariates. Model selection consistency as well as optimal rates of convergence are obtained under mild conditions. Numerical studies demonstrate good finite-sample performance in terms of both model selection and out-of-sample prediction. Moreover, our method is flexible in the sense that it pays no price in the weakly correlated and uncorrelated cases. Our method is applicable to a wide range of high-dimensional sparse regression problems. An R package, FarmSelect, is also provided for implementation.
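The two-step idea in the abstract (estimate the latent factors, then run a regularized regression on the decorrelated idiosyncratic components) can be sketched as follows. This is a hypothetical NumPy illustration using PCA for the factor step and a bare-bones coordinate-descent Lasso for the selection step; it is not the FarmSelect package implementation, and the simulation sizes, the number of factors `k`, and the penalty level `lam=0.1` are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, k = 200, 50, 2

# Simulated design with a latent factor structure: X = F B' + U, so the
# columns of X are highly correlated through the common factors F.
F = rng.normal(size=(n, k))                 # latent factors
B = rng.normal(size=(p, k))                 # factor loadings
U = rng.normal(scale=0.5, size=(n, p))      # weakly correlated idiosyncratic part
X = F @ B.T + U

beta = np.zeros(p)
beta[:3] = [2.0, -1.5, 1.0]                 # sparse true coefficients
y = X @ beta + rng.normal(scale=0.5, size=n)

# Step 1 (factor adjustment): estimate the factors by PCA on the centered
# design and form the idiosyncratic residuals U_hat = X - F_hat B_hat'.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
B_hat = Vt[:k].T                            # top-k principal directions
F_hat = Xc @ B_hat                          # estimated factor scores
U_hat = Xc - F_hat @ B_hat.T                # decorrelated covariates

# Step 2: l1-penalized regression on the augmented design [U_hat, F_hat];
# the nonzero coefficients on the U_hat block give the selected model.
def lasso_cd(Z, y, lam, n_iter=200):
    """Minimize (1/2n)||y - Z b||^2 + lam * ||b||_1 by coordinate descent."""
    n_obs, d = Z.shape
    b = np.zeros(d)
    col_norm2 = (Z ** 2).sum(axis=0)
    r = y - Z @ b
    for _ in range(n_iter):
        for j in range(d):
            r += Z[:, j] * b[j]             # add back j-th contribution
            rho = Z[:, j] @ r
            b[j] = np.sign(rho) * max(abs(rho) - lam * n_obs, 0.0) / col_norm2[j]
            r -= Z[:, j] * b[j]
    return b

Z = np.hstack([U_hat, F_hat])
b_hat = lasso_cd(Z, y - y.mean(), lam=0.1)
selected = np.flatnonzero(b_hat[:p] != 0)
print("selected covariates:", selected)
```

The point of the factor step is visible in the correlation structure: the off-diagonal correlations of `U_hat` are far smaller than those of `X`, which is what lets an ordinary penalized regression select consistently in the second step.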
Recommendations
- Model selection in factor-augmented regressions with estimated factors
- Model selection for generalized linear models with factor-augmented predictors
- Factor models and variable selection in high-dimensional regression analysis
- Variable selection by regularization methods for generalized mixed models
- Rank regularized estimation of approximate factor models
- Variable Selection in the Presence of Factors: A Model Selection Perspective
- Bayesian factor-adjusted sparse regression
- Regularization and variable selection in Heckman selection model
- Regularization and model selection with categorial effect modifiers
- Model selection in regression under structural constraints
Cites work
- scientific article; zbMATH DE number 5957408 (no title available)
- scientific article; zbMATH DE number 4135256 (no title available)
- scientific article; zbMATH DE number 3444596 (no title available)
- scientific article; zbMATH DE number 845714 (no title available)
- scientific article; zbMATH DE number 3396952 (no title available)
- A Bernstein type inequality and moderate deviations for weakly dependent sequences
- ARMA model identification
- Asymptotics of empirical eigenstructure for high dimensional spiked covariance
- Contour projected dimension reduction
- Determining the Number of Factors in Approximate Factor Models
- Determining the Number of Factors in the General Dynamic Factor Model
- Distributed optimization and statistical learning via the alternating direction method of multipliers
- Eigenvalue ratio test for the number of factors
- Estimating the dimension of a model
- Factor modeling for high-dimensional time series: inference for the number of factors
- Factor models and variable selection in high-dimensional regression analysis
- Factor profiled sure independence screening
- Forecasting Using Principal Components From a Large Number of Predictors
- Forecasting the U.S. Unemployment Rate
- High dimensional ordinary least squares projection for screening variables
- High dimensional stochastic regression with latent factors, endogeneity and nonlinearity
- High-dimensional graphs and variable selection with the Lasso
- Inferential Theory for Factor Models of Large Dimensions
- Large covariance estimation by thresholding principal orthogonal complements. With discussion and authors' reply
- Least angle regression. (With discussion)
- Nearly unbiased variable selection under minimax concave penalty
- Nonconcave penalized likelihood with a diverging number of parameters.
- On consistency and sparsity for principal components analysis in high dimensions
- On model selection consistency of regularized M-estimators
- One-step sparse estimates in nonconcave penalized likelihood models
- Optimally sparse representation in general (nonorthogonal) dictionaries via ℓ 1 minimization
- Regularization and Variable Selection Via the Elastic Net
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Simultaneous analysis of Lasso and Dantzig selector
- Sure independence screening for ultrahigh dimensional feature space. With discussion and authors' reply
- Sure independence screening in generalized linear models with NP-dimensionality
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- The Generalized Dynamic Factor Model
- The statistics and mathematics of high dimension low sample size asymptotics
- Use of canonical analysis in time series model identification
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
Cited in (29)
- Mean tests for high-dimensional time series
- Factor Augmented Sparse Throughput Deep ReLU Neural Networks for High Dimensional Regression
- A generalized knockoff procedure for FDR control in structural change detection
- Learning Latent Factors From Diversified Projections and Its Applications to Over-Estimated and Weak Factors
- Are Latent Factor Regression and Sparse Regression Adequate?
- Comment on “A Tuning-Free Robust and Efficient Approach to High-Dimensional Regression”
- A Decorrelating and Debiasing Approach to Simultaneous Inference for High-Dimensional Confounded Models
- Factor models and variable selection in high-dimensional regression analysis
- Canonical thresholding for nonsparse high-dimensional linear regression
- Bayesian factor-adjusted sparse regression
- Do We Exploit all Information for Counterfactual Analysis? Benefits of Factor Models and Idiosyncratic Correction
- Efficient change point detection and estimation in high-dimensional correlation matrices
- F-test and z-test for high-dimensional regression models with a factor structure
- Model selection in factor-augmented regressions with estimated factors
- Semi-Standard Partial Covariance Variable Selection When Irrepresentable Conditions Fail
- scientific article; zbMATH DE number 7306913 (no title available)
- Nonparametric estimation of the random coefficients model: an elastic net approach
- Bayesian MIDAS penalized regressions: estimation, selection, and prediction
- Model-Free Feature Screening and FDR Control With Knockoff Features
- FarmSelect
- Noisy matrix completion: understanding statistical guarantees for convex relaxation via nonconvex optimization
- Bridging factor and sparse models
- FNETS: Factor-Adjusted Network Estimation and Forecasting for High-Dimensional Time Series
- Model selection for generalized linear models with factor-augmented predictors
- Ridge Regression Under Dense Factor Augmented Models
- Factor Augmented Inverse Regression and its Application to Microbiome Data Analysis
- High-Dimensional Time Series Segmentation via Factor-Adjusted Vector Autoregressive Modeling
- Variable selection in high dimensional linear regressions with parameter instability
- Factor-adjusted tests for generalized linear models with multimodal data: an application to breast cancer data