Coordinatewise Gaussianization: Theories and Applications
Publication: 6185499
DOI: 10.1080/01621459.2022.2044825
OpenAlex: W4213449012
MaRDI QID: Q6185499
Publication date: 8 January 2024
Published in: Journal of the American Statistical Association
Full work available at URL: https://doi.org/10.1080/01621459.2022.2044825
Keywords: heavy tails; normal score transformation; nearest shrunken centroids classifier; Gaussianization; Gaussianized distance correlation
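The keywords center on the normal score transformation, i.e., Gaussianizing each coordinate of the data marginally before downstream analysis such as screening, classification, or graphical modeling. As a point of reference, below is a minimal sketch of coordinatewise Gaussianization via empirical normal scores; the function name, the tie-handling choice, and the n + 1 denominator are illustrative assumptions, not the paper's exact estimator.

```python
import numpy as np
from scipy.stats import norm, rankdata

def coordinatewise_gaussianize(X):
    """Empirical normal score transform applied to each column of X.

    Each coordinate is ranked, ranks are rescaled into (0, 1), and the
    standard normal quantile function maps them to normal scores, so
    every marginal distribution becomes approximately N(0, 1).
    Illustrative sketch only: tie handling ('average') and the n + 1
    denominator are assumptions, not necessarily the paper's choices.
    """
    X = np.asarray(X, dtype=float)
    n = X.shape[0]
    ranks = rankdata(X, method="average", axis=0)  # column-wise ranks in {1, ..., n}
    return norm.ppf(ranks / (n + 1))               # quantiles lie strictly inside (0, 1)

# Example: heavy-tailed t(2) margins become approximately standard normal.
rng = np.random.default_rng(0)
X = rng.standard_t(df=2, size=(500, 3))
Z = coordinatewise_gaussianize(X)
print(Z.mean(axis=0))  # close to 0 in each coordinate
print(Z.std(axis=0))   # slightly below 1, since quantiles are capped at n/(n + 1)
```

On heavy-tailed inputs such as the t(2) draws above, the transformed margins are approximately N(0, 1), which is what makes rank-based Gaussianization attractive under the "heavy tails" keyword.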
Cites Work
- Eight unnamed items (unresolved reference entries)
- Measuring and testing dependence by correlation of distances
- Model-Free Feature Screening for Ultrahigh-Dimensional Data
- Sure independence screening in generalized linear models with NP-dimensionality
- Extending the rank likelihood for semiparametric copula estimation
- Sparse inverse covariance estimation with the graphical lasso
- Estimation of copula-based semiparametric time series models
- Local independence feature screening for nonparametric and semiparametric models by marginal empirical likelihood
- Marginal empirical likelihood and sure independence feature screening
- Influential features PCA for high dimensional clustering
- Robust rank correlation based screening
- Regularized rank-based estimation of high-dimensional nonparanormal graphical models
- Brownian distance covariance
- High-dimensional classification using features annealed independence rules
- High-dimensional additive modeling
- Efficient estimation in the bivariate normal copula model: Normal margins are least favourable
- Class prediction by nearest shrunken centroids, with applications to DNA microarrays
- Robust feature screening for ultra-high dimensional right censored data via distance correlation
- High-dimensional semiparametric Gaussian copula graphical models
- High-dimensional covariance estimation by minimizing \(\ell_1\)-penalized log-determinant divergence
- Sparse semiparametric discriminant analysis
- Information bounds for Gaussian copulas
- High-dimensional graphs and variable selection with the Lasso
- The fused Kolmogorov filter: a nonparametric model-free screening method
- Nonparametric Independence Screening in Sparse Ultra-High-Dimensional Additive Models
- A Constrained \(\ell_1\) Minimization Approach to Sparse Precision Matrix Estimation
- A direct approach to sparse discriminant analysis in ultra-high dimensions
- A Direct Estimation Approach to Sparse Linear Discriminant Analysis
- The Future of Data Analysis
- Regularized quantile regression and robust feature screening for single index models
- Discriminant analysis through a semiparametric model
- High-Dimensional Gaussian Copula Regression: Adaptive Estimation and Statistical Inference
- Feature Screening via Distance Correlation Learning
- On Consistency and Sparsity for Principal Components Analysis in High Dimensions
- Partial Correlation Estimation by Joint Sparse Regression Models
- Model-Free Feature Screening for Ultrahigh Dimensional Discriminant Analysis
- Sparse precision matrix estimation via lasso penalized D-trace loss
- The Kolmogorov filter for variable screening in high-dimensional binary classification
- A Road to Classification in High Dimensional Space: The Regularized Optimal Affine Discriminant
- The elements of statistical learning. Data mining, inference, and prediction