Sure independence screening for ultrahigh dimensional feature space. With discussion and authors' reply
DOI: 10.1111/J.1467-9868.2008.00674.X
zbMATH Open: 1411.62187
arXiv: math/0612857
OpenAlex: W2154560360
Wikidata: Q42087328 (Scholia: Q42087328)
MaRDI QID: Q4632602
FDO: Q4632602
Authors: Jianqing Fan, Jinchi Lv
Publication date: 30 April 2019
Published in: Journal of the Royal Statistical Society Series B: Statistical Methodology
Full work available at URL: https://arxiv.org/abs/math/0612857
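The method this record describes, sure independence screening (SIS), ranks each predictor by its absolute marginal correlation with the response and retains only the top-ranked subset before any refined model selection. A minimal sketch of that ranking step follows; the function name, argument defaults, and tie-handling are illustrative choices, not taken from the authors' own software:

```python
import numpy as np

def sis_screen(X, y, d=None):
    """Keep the d features most marginally correlated with y (SIS sketch).

    X : (n, p) design matrix, y : (n,) response.
    If d is None, use the paper's suggested submodel size ceil(n / log n).
    """
    n, p = X.shape
    if d is None:
        d = int(np.ceil(n / np.log(n)))
    # Standardize columns of X and the response.
    Xs = (X - X.mean(axis=0)) / X.std(axis=0)
    ys = (y - y.mean()) / y.std()
    # Componentwise absolute sample correlations with the response.
    omega = np.abs(Xs.T @ ys) / n
    # Indices of the d top-ranked features, largest correlation first.
    return np.argsort(omega)[::-1][:d]
```

After screening, the surviving features would typically be passed to a penalized method such as the Lasso or SCAD, as the paper's two-stage recipe suggests.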
Recommendations
- Sure independence screening in generalized linear models with NP-dimensionality
- Ultrahigh dimensional feature selection: beyond the linear model
- Factor profiled sure independence screening
- High-dimensional variable selection
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
Keywords: Lasso; variable selection; dimensionality reduction; adaptive Lasso; sure independence screening; sure screening; Dantzig selector; oracle estimator; smoothly clipped absolute deviation
MSC classification: Point estimation (62F10); Linear regression; mixed models (62J05); Ridge regression; shrinkage estimators (Lasso) (62J07); Research exposition (monographs, survey articles) pertaining to statistics (62-02)
Cites Work
- A decision-theoretic generalization of on-line learning and an application to boosting
- Heuristics of instability and stabilization in model selection
- The Adaptive Lasso and Its Oracle Properties
- Least angle regression. (With discussion)
- Weak convergence and empirical processes. With applications to statistics
- Pathwise coordinate optimization
- Ideal spatial adaptation by wavelet shrinkage
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Title not available
- Statistics on special manifolds
- Simultaneous analysis of Lasso and Dantzig selector
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- High-dimensional graphs and variable selection with the Lasso
- Title not available
- High-dimensional classification using features annealed independence rules
- Sparsistency and rates of convergence in large covariance matrix estimation
- Persistence in high-dimensional linear predictor-selection and the virtue of overparametrization
- Some theory for Fisher's linear discriminant function, `naive Bayes', and some alternatives when there are many more variables than observations
- Asymptotics for Lasso-type estimators.
- On the distribution of the largest eigenvalue in principal components analysis
- Variable selection for Cox's proportional hazards model and frailty model
- Regularized estimation of large covariance matrices
- Variable selection using MM algorithms
- The Group Lasso for Logistic Regression
- Title not available
- Title not available
- Optimally sparse representation in general (nonorthogonal) dictionaries via \(\ell_1\) minimization
- Statistical significance for genomewide studies
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Relaxed Lasso
- Nonconcave penalized likelihood with a diverging number of parameters.
- Better Subset Regression Using the Nonnegative Garrote
- Title not available
- Regularization of Wavelet Approximations
- Local Strong Homogeneity of a Regularized Estimator
- A Statistical View of Some Chemometrics Regression Tools
- Statistical challenges with high dimensionality: feature selection in knowledge discovery
- Rejoinder: One-step sparse estimates in nonconcave penalized likelihood models
- Asymptotic properties of bridge estimators in sparse high-dimensional regression models
- Approximation and learning by greedy algorithms
- The concentration of measure phenomenon
- Limit of the smallest eigenvalue of a large dimensional sample covariance matrix
- Uncertainty principles and ideal atomic decomposition
- Geometric Representation of High Dimension, Low Sample Size Data
- Title not available
- Best subset selection, persistence in high-dimensional statistical learning and optimization under \(l_1\) constraint
- "Preconditioning" for feature selection and regression in high-dimensional problems
- A limit theorem for the norm of random matrices
- Comments on: "Wavelets in statistics: a review" by A. Antoniadis
- The smallest eigenvalue of a large dimensional Wishart matrix
- Deviation Inequalities on Largest Eigenvalues
Cited In (only showing first 100 items)
- Variable screening for ultrahigh dimensional censored quantile regression
- Feature screening for ultrahigh-dimensional censored data with varying coefficient single-index model
- A model-free conditional screening approach via sufficient dimension reduction
- Asymptotics of AIC, BIC and \(C_p\) model selection rules in high-dimensional regression
- Fast robust feature screening for ultrahigh-dimensional varying coefficient models
- Adaptive hybrid screening for efficient lasso optimization
- A selective overview of sparse sufficient dimension reduction
- Network-based feature screening with applications to genome data
- Individual-level social influence identification in social media: a learning-simulation coordinated method
- Robust dependence measure for detecting associations in large data set
- Model-free slice screening for ultrahigh-dimensional survival data
- On correlation rank screening for ultra-high dimensional competing risks data
- Sequential profile Lasso for ultra-high-dimensional partially linear models
- Inference for biased models: a quasi-instrumental variable approach
- A sequential approach to feature selection in high-dimensional additive models
- Title not available
- Forward selection for feature screening and structure identification in varying coefficient models
- Conditional feature screening for mean and variance functions in models with multiple-index structure
- Censored cumulative residual independent screening for ultrahigh-dimensional survival data
- Conditional quantile correlation screening procedure for ultrahigh-dimensional varying coefficient models
- Model‐free conditional screening for ultrahigh‐dimensional survival data via conditional distance correlation
- Model-Free Conditional Feature Screening with FDR Control
- Robust conditional nonparametric independence screening for ultrahigh-dimensional data
- Feature screening for nonparametric and semiparametric models with ultrahigh-dimensional covariates
- Screening group variables in the proportional hazards model
- Conditional screening for ultra-high dimensional covariates with survival outcomes
- Unified mean-variance feature screening for ultrahigh-dimensional regression
- High-dimensional variable screening through kernel-based conditional mean dependence
- Portfolio optimization with ambiguous correlation and stochastic volatilities
- Canonical kernel dimension reduction
- Principal components adjusted variable screening
- Ultrahigh dimensional feature screening via projection
- Two-layer EM algorithm for ALD mixture regression models: a new solution to composite quantile regression
- A new nonparametric screening method for ultrahigh-dimensional survival data
- Sparse pathway-based prediction models for high-throughput molecular data
- A new robust model-free feature screening method for ultra-high dimensional right censored data
- Robust model-free feature screening for ultrahigh dimensional surrogate data
- Stable feature screening for ultrahigh dimensional data
- Nonparametric independence feature screening for ultrahigh-dimensional survival data
- Conditional sure independence screening by conditional marginal empirical likelihood
- Conditional-quantile screening for ultrahigh-dimensional survival data via martingale difference correlation
- Model-free feature screening for high-dimensional survival data
- Factor-adjusted multiple testing of correlations
- Model-free feature screening via distance correlation for ultrahigh dimensional survival data
- Conditional screening for ultrahigh-dimensional survival data in case-cohort studies
- Non-marginal feature screening for varying coefficient competing risks model
- Variable screening for varying coefficient models with ultrahigh-dimensional survival data
- Model-free feature screening for ultrahigh dimensional classification
- Model free feature screening for ultrahigh dimensional covariates with right censored outcomes
- Composite quantile regression for ultra-high dimensional semiparametric model averaging
- On the oracle property of a generalized adaptive elastic-net for multivariate linear regression with a diverging number of parameters
- Adjusted Pearson chi-square feature screening for multi-classification with ultrahigh dimensional data
- A New Model-Free Feature Screening Procedure for Ultrahigh-Dimensional Interval-Censored Failure Time Data
- Bayesian sparse reduced rank multivariate regression
- On the accuracy in high-dimensional linear models and its application to genomic selection
- Generalized regression estimators with high-dimensional covariates
- Feature screening for high-dimensional survival data via censored quantile correlation
- On reject and refine options in multicategory classification
- Extended differential geometric LARS for high-dimensional GLMs with general dispersion parameter
- Conditional characteristic feature screening for massive imbalanced data
- A simple model-free survival conditional feature screening
- Optimal directional statistic for general regression
- Model selection using mass-nonlocal prior
- Covariance-insured screening
- Feature screening in ultrahigh-dimensional partially linear models with missing responses at random
- Regression adjustment for treatment effect with multicollinearity in high dimensions
- A method for selecting the relevant dimensions for high-dimensional classification in singular vector spaces
- Random projections as regularizers: learning a linear discriminant from fewer observations than dimensions
- Modified SCAD penalty for constrained variable selection problems
- Nonconvex penalized ridge estimations for partially linear additive models in ultrahigh dimension
- Feature elimination in kernel machines in moderately high dimensions
- An RKHS model for variable selection in functional linear regression
- Robust feature screening for high-dimensional survival data
- Portal nodes screening for large scale social networks
- Feature screening for network autoregression model
- Variable screening for ultrahigh dimensional heterogeneous data via conditional quantile correlations
- Hypothesis testing sure independence screening for nonparametric regression
- Variable screening for high dimensional time series
- Fused variable screening for massive imbalanced data
- A nonparametric feature screening method for ultrahigh-dimensional missing response
- Approximate least squares estimation for spatial autoregressive models with covariates
- Feature screening for ultrahigh dimensional categorical data with covariates missing at random
- A note on quantile feature screening via distance correlation
- New hard-thresholding rules based on data splitting in high-dimensional imbalanced classification
- Determining cutoff point of ensemble trees based on sample size in predicting clinical dose with DNA microarray data
- I-LAMM for sparse learning: simultaneous control of algorithmic complexity and statistical error
- Principles of experimental design for big data analysis
- Optimal estimation of slope vector in high-dimensional linear transformation models
- Robust feature screening for elliptical copula regression model
- Projection quantile correlation and its use in high-dimensional grouped variable screening
- Stable prediction in high-dimensional linear models
- Feature screening with large‐scale and high‐dimensional survival data
- Variable selection for partially linear models via Bayesian subset modeling with diffusing prior
- Feature screening for ultrahigh-dimensional survival data when failure indicators are missing at random
- Testing for Neglected Nonlinearity Using Regularized Artificial Neural Networks
- Simultaneous Clustering and Estimation of Heterogeneous Graphical Models
- Feature selection in finite mixture of sparse normal linear models in high-dimensional feature space
- An RKHS-based approach to double-penalized regression in high-dimensional partially linear models
- Pruning variable selection ensembles
- Least-Square Approximation for a Distributed System