Nonparametric independence screening in sparse ultra-high-dimensional additive models
From MaRDI portal
Publication:3095174
Abstract: A variable screening procedure via correlation learning was proposed by Fan and Lv (2008) to reduce dimensionality in sparse ultra-high-dimensional models. Even when the true model is linear, however, the marginal regression can be highly nonlinear. To address this issue, we extend correlation learning to marginal nonparametric learning. Our nonparametric independence screening, called NIS, is a specific member of the sure independence screening family. Several closely related variable screening procedures are also proposed. For nonparametric additive models, it is shown that, under mild technical conditions, the proposed independence screening methods enjoy a sure screening property, and the extent to which the dimensionality can be reduced by independence screening is explicitly quantified. As a methodological extension, an iterative nonparametric independence screening (INIS) procedure is also proposed to enhance finite-sample performance when fitting sparse additive models. Simulation results and a real data analysis demonstrate that the proposed procedures work well with moderate sample sizes and large dimensions and outperform competing methods.
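The screening idea in the abstract can be sketched in a few lines: fit a marginal nonparametric regression of the response on each predictor separately, rank predictors by the strength of the marginal fit, and keep the top-ranked ones. The sketch below uses a polynomial basis as a simple stand-in for the B-spline basis of the paper, and the cutoff `d = n / log(n)` is only an illustrative choice, not the paper's exact data-driven rule; the function name and toy data are hypothetical.

```python
import numpy as np

def nis_screen(X, y, degree=3, d=None):
    """Sketch of nonparametric independence screening (NIS).

    For each predictor X[:, j], fit a marginal polynomial regression of y
    on X[:, j] (a stand-in for the paper's B-spline basis) and rank
    predictors by the empirical variance of the fitted values.
    """
    n, p = X.shape
    if d is None:
        d = int(n / np.log(n))  # illustrative cutoff, not the paper's rule
    utilities = np.empty(p)
    for j in range(p):
        # Marginal design matrix: polynomial basis in X[:, j] (incl. intercept).
        B = np.vander(X[:, j], degree + 1)
        coef, *_ = np.linalg.lstsq(B, y, rcond=None)
        fitted = B @ coef
        utilities[j] = np.var(fitted)  # marginal signal strength of X[:, j]
    # Keep the d predictors with the largest marginal utilities.
    return np.argsort(utilities)[::-1][:d]

# Toy sparse additive model: only features 0 and 1 carry signal.
rng = np.random.default_rng(0)
n, p = 200, 1000
X = rng.uniform(-2, 2, size=(n, p))
y = np.sin(X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.standard_normal(n)
selected = nis_screen(X, y, d=10)
print(sorted(selected))
```

With high probability the two active features survive the screen, reducing the problem from 1000 candidates to 10 before any joint model is fit; the iterative INIS variant would then refit after removing the effect of the selected features.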
Recommendations
- Nonparametric Independence Screening in Sparse Ultra-High-Dimensional Varying Coefficient Models
- Marginal empirical likelihood independence screening in sparse ultra-high dimensional additive models
- A sure independence screening procedure for ultra-high dimensional partially linear additive models
- Sure independence screening in generalized linear models with NP-dimensionality
- Robust conditional nonparametric independence screening for ultrahigh-dimensional data
Cited in
- Model-free conditional screening via conditional distance correlation
- Surrogate-variable-based model-free feature screening for survival data under the general censoring mechanism
- Model averaging marginal regression for high dimensional conditional quantile prediction
- Feature screening and FDR control with knockoff features for ultrahigh-dimensional right-censored data
- Independence index sufficient variable screening for categorical responses
- Model-free forward screening via cumulative divergence
- Copula-based Partial Correlation Screening: a Joint and Robust Approach
- Cluster feature selection in high-dimensional linear models
- Learning sparse conditional distribution: an efficient kernel-based approach
- The backbone method for ultra-high dimensional sparse machine learning
- Robust feature screening for elliptical copula regression model
- Conditional characteristic feature screening for massive imbalanced data
- Principal varying coefficient estimator for high-dimensional models
- L2RM: Low-Rank Linear Regression Models for High-Dimensional Matrix Responses
- Semiparametric model average prediction in panel data analysis
- Asset selection based on high frequency Sharpe ratio
- Ultrahigh dimensional feature screening for additive model with multivariate response
- Model-free feature screening via a modified composite quantile correlation
- Feature selection for generalized varying coefficient mixed-effect models with application to obesity GWAS
- Robust composite weighted quantile screening for ultrahigh dimensional discriminant analysis
- Conditional SIRS for nonparametric and semiparametric models by marginal empirical likelihood
- Fast feature selection via streamwise procedure for massive data
- An iterative approach to distance correlation-based sure independence screening
- Nonparametric variable screening for multivariate additive models
- RCV-based error density estimation in the ultrahigh dimensional additive model
- A general framework for tensor screening through smoothing
- Uniform joint screening for ultra-high dimensional graphical models
- On sufficient variable screening using log odds ratio filter
- Feature screening for network autoregression model
- Remodeling and estimation for sparse partially linear regression models
- Tests for \(p\)-regression coefficients in linear panel model when \(p\) is divergent
- Asymptotic properties of high-dimensional random forests
- A robust variable screening method for high-dimensional data
- Dynamic tilted current correlation for high dimensional variable screening
- Model-Free Feature Screening and FDR Control With Knockoff Features
- Group screening for ultra-high-dimensional feature under linear model
- Sufficient variable selection using independence measures for continuous response
- Composite coefficient of determination and its application in ultrahigh dimensional variable screening
- Sparse Composite Quantile Regression with Ultra-high Dimensional Heterogeneous Data
- Feature screening under missing indicator imputation with non-ignorable missing response
- Interaction identification and clique screening for classification with ultra-high dimensional discrete features
- A nonparametric procedure for linear and nonlinear variable screening
- The Kendall interaction filter for variable interaction screening in high dimensional classification problems
- Sure independence screening in the presence of missing data
- Variance ratio screening for ultrahigh dimensional discriminant analysis
- A modified mean-variance feature-screening procedure for ultrahigh-dimensional discriminant analysis
- Nonparametric independence screening for ultra-high-dimensional longitudinal data under additive models
- Fused variable screening for massive imbalanced data
- A nonparametric feature screening method for ultrahigh-dimensional missing response
- A note on quantile feature screening via distance correlation
- Variable selection in finite mixture of semi-parametric regression models
- Gini correlation for feature screening
- Grouped feature screening for ultra-high dimensional data for the classification model
- Feature screening in ultrahigh-dimensional varying-coefficient Cox model
- Sure independence screening in ultrahigh dimensional generalized additive models
- The fused Kolmogorov filter: a nonparametric model-free screening method
- Quantile-adaptive model-free variable screening for high-dimensional heterogeneous data
- Marginal empirical likelihood and sure independence feature screening
- Martingale difference correlation and its use in high-dimensional variable screening
- A flexible semiparametric forecasting model for time series
- Semiparametric model averaging prediction for dichotomous response
- Robust feature screening procedures for single and mixed types of data
- Model-free sure screening via maximum correlation
- Nonparametric variable selection and its application to additive models
- Semiparametric Bayesian information criterion for model selection in ultra-high dimensional additive models
- Estimation and model selection in generalized additive partial linear models for correlated data with diverging number of covariates
- Goodness-of-fit testing-based selection for large-\(p\)-small-\(n\) problems: a two-stage ranking approach
- Asymptotic normality of DHD estimators in a partially linear model
- Simultaneous variable selection and estimation in semiparametric modeling of longitudinal/clustered data
- Semiparametric regression models with additive nonparametric components and high dimensional parametric components
- A sure independence screening procedure for ultra-high dimensional partially linear additive models
- Focused information criterion and model averaging for generalized additive partial linear models
- Local independence feature screening for nonparametric and semiparametric models by marginal empirical likelihood
- Neuronized Priors for Bayesian Sparse Linear Regression
- Robust rank correlation based screening
- Sparse Single Index Models for Multivariate Responses
- Model selection for high-dimensional quadratic regression via regularization
- Component selection in additive quantile regression models
- Fast Bayesian variable selection for high dimensional linear models: marginal solo spike and slab priors
- Ranking-based variable selection for high-dimensional data
- Sequential Lasso cum EBIC for feature selection with ultra-high dimensional feature space
- High dimensional single index models
- Global sensitivity analysis with dependence measures
- Asymptotics for penalised splines in generalised additive models
- Feature screening for time-varying coefficient models with ultrahigh-dimensional longitudinal data
- Parametric and semiparametric reduced-rank regression with flexible sparsity
- Greedy forward regression for variable screening
- Sparse model identification and learning for ultra-high-dimensional additive partially linear models
- Penalized profiled semiparametric estimating functions
- Spline estimator for simultaneous variable selection and constant coefficient identification in high-dimensional generalized varying-coefficient models
- Polynomial spline estimation for generalized varying coefficient partially linear models with a diverging number of components
- Feature screening via distance correlation learning
- A projection-based conditional dependence measure with applications to high-dimensional undirected graphical models
- Model-free conditional independence feature screening for ultrahigh dimensional data
- Feature selection for varying coefficient models with ultrahigh-dimensional covariates
- Sure independence screening in generalized linear models with NP-dimensionality
- Variable selection for high-dimensional varying coefficient partially linear models via nonconcave penalty
- Model selection and structure specification in ultra-high dimensional generalised semi-varying coefficient models
- Impacts of high dimensionality in finite samples
- Consistent Screening Procedures in High-dimensional Binary Classification