On dimension folding of matrix- or array-valued statistical objects
From MaRDI portal
Publication:2380098
Abstract: We consider dimension reduction for regression or classification in which the predictors are matrix- or array-valued. This type of predictor arises when measurements are obtained for each combination of two or more underlying variables, for example, the voltage measured at different channels and times in electroencephalography data. For these applications, it is desirable to preserve the array structure of the reduced predictor (e.g., time versus channel), but this cannot be achieved within the conventional dimension reduction formulation. In this paper, we introduce a dimension reduction method, called dimension folding, for matrix- and array-valued predictors that preserves the array structure. In an application of dimension folding to an electroencephalography data set, we correctly classify 97 out of 122 subjects as alcoholic or nonalcoholic based on their electroencephalograms in a cross-validation sample.
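The idea of preserving the array structure can be illustrated with a minimal sketch: instead of vectorizing a p x q predictor and projecting it with one large matrix, dimension folding projects it from the left and the right, so the reduced predictor is again a (smaller) matrix. The sketch below uses plain eigendecompositions of the mode-wise second-moment matrices as stand-in estimators for the left and right bases; the paper's actual sufficient-dimension-folding estimators are different, and all dimensions and names here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: n matrix-valued predictors of size p x q
# (e.g. channels x time points in an EEG recording), to be reduced
# to d1 x d2 while keeping the row/column (channel/time) structure.
n, p, q, d1, d2 = 100, 8, 10, 2, 3
X = rng.standard_normal((n, p, q))

# Mode-wise second-moment matrices (stand-ins for the estimators
# used in the paper, not the paper's method itself).
row_mom = np.einsum('nij,nkj->ik', X, X) / n   # p x p
col_mom = np.einsum('nij,nik->jk', X, X) / n   # q x q

# Top eigenvectors give left (A: p x d1) and right (B: q x d2) bases;
# np.linalg.eigh returns eigenvalues in ascending order.
A = np.linalg.eigh(row_mom)[1][:, -d1:]
B = np.linalg.eigh(col_mom)[1][:, -d2:]

# Folded reduction A^T X B: each predictor stays a matrix (d1 x d2),
# unlike a vectorized reduction, which would return a flat vector.
X_folded = np.einsum('pi,npq,qj->nij', A, X, B)
print(X_folded.shape)  # (100, 2, 3)
```

The key contrast with conventional sufficient dimension reduction is the Kronecker-structured projection: vectorizing X and applying (B kron A)^T to vec(X) gives the same numbers, but the folded form keeps the two underlying variables (channel and time) identifiable in the reduced predictor.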
Recommendations
- Dimension folding PCA and PFC for matrix-valued predictors
- Ensemble sufficient dimension folding methods for analyzing matrix-valued data
- Sufficient dimension folding for a functional of conditional distribution of matrix- or array-valued objects
- Sufficient dimension reduction and prediction in regression
- Dimension reduction in regression without matrix inversion
Cites work
- scientific article; zbMATH DE number 5017166 (title unavailable)
- scientific article; zbMATH DE number 4060392 (title unavailable)
- scientific article; zbMATH DE number 1220060 (title unavailable)
- scientific article; zbMATH DE number 1953122 (title unavailable)
- scientific article; zbMATH DE number 1850469 (title unavailable)
- Applied multivariate statistical analysis.
- Asymptotics for kernel estimate of sliced inverse regression
- Augmenting the bootstrap to analyze high dimensional genomic data
- Comments on: Augmenting the bootstrap to analyze high dimensional genomic data
- Contour regression: a general approach to dimension reduction
- Determining the dimension of iterative Hessian transformation
- Dimension Reduction for the Conditional kth Moment in Regression
- Dimension reduction for conditional mean in regression
- Dimension reduction for nonelliptically distributed predictors
- Dimension reduction in regression without matrix inversion
- Dimension reduction strategies for analyzing global gene expression data with a response
- Envelope models for parsimonious and efficient multivariate linear regression
- Estimating the structural dimension of regressions via parametric inverse regression
- Generalized Inverses, Ridge Regression, Biased Linear Estimation, and Nonlinear Estimation
- Graphics for Regressions With a Binary Response
- On Directional Regression for Dimension Reduction
- On Principal Hessian Directions for Data Visualization and Dimension Reduction: Another Application of Stein's Lemma
- On a projective resampling method for dimension reduction with multivariate responses
- On the Interpretation of Regression Plots
- Robust dimension reduction based on canonical correlation
- Sliced Inverse Regression for Dimension Reduction
- Slicing regression: A link-free regression method
- Successive direction extraction for estimating the central subspace in a multiple-index regression
- Sufficient Dimension Reduction via Inverse Regression
- Sufficient dimension reduction and graphics in regression
- The commutation matrix: Some properties and applications
- Using the Bootstrap to Select One of a New Class of Dimension Reduction Methods
Cited in (48)
- Predictive power of principal components for single-index model and sufficient dimension reduction
- A portmanteau local feature discrimination approach to the classification with high-dimensional matrix-variate data
- A classifier for multi-dimensional datasets based on Bayesian multiple kernel grouping learning
- A model-free conditional screening approach via sufficient dimension reduction
- Sliced inverse regression for integrative multi-omics data analysis
- Hypothesis Testing of Matrix Graph Model with Application to Brain Connectivity Analysis
- Functional Graphical Models
- Sliced average variance estimation for tensor data
- Tensor sufficient dimension reduction
- Ensemble sufficient dimension folding methods for analyzing matrix-valued data
- scientific article; zbMATH DE number 7370581 (title unavailable)
- Independent component analysis for tensor-valued data
- Sufficient dimension folding via tensor inverse regression
- Sufficient dimension folding for a functional of conditional distribution of matrix- or array-valued objects
- On testing common indices for two multi-index models: a link-free approach
- Hyperparameter estimation for sparse Bayesian learning models
- Latent factor model for multivariate functional data
- Covariance estimation via sparse Kronecker structures
- Quad folding: a simple idea for the subjective property characterization of large sample sets
- Nonparametric Functional Graphical Modeling Through Functional Additive Regression Operator
- Sparse minimum discrepancy approach to sufficient dimension reduction with simultaneous variable selection in ultrahigh dimension
- A general framework for tensor screening through smoothing
- Robust estimator of the correlation matrix with sparse Kronecker structure for a high-dimensional matrix-variate
- A varying coefficient model with matrix valued covariates
- On the asymptotic normality and efficiency of Kronecker envelope principal component analysis
- Structured time-dependent inverse regression (STIR)
- Conditional mean dimension reduction for tensor time series
- Low-rank latent matrix-factor prediction modeling for generalized high-dimensional matrix-variate regression
- Tensor sliced inverse regression
- Improved estimation in tensor regression with multiple change-points
- On sufficient dimension reduction via principal asymmetric least squares
- Dimension folding PCA and PFC for matrix-valued predictors
- Statistical inference on the significance of rows and columns for matrix-valued data in an additive model
- Dimension Reduction for Fréchet Regression
- Likelihood-Based Dimension Folding on Tensor Data
- A Nonparametric Graphical Model for Functional Data With Application to Brain Networks Based on fMRI
- Trace pursuit variable selection for multi-population data
- Copula Gaussian Graphical Models for Functional Data
- scientific article; zbMATH DE number 7415090 (title unavailable)
- Fast and Separable Estimation in High-Dimensional Tensor Gaussian Graphical Models
- Tensor Regression with Applications in Neuroimaging Data Analysis
- Model-Free Variable Selection With Matrix-Valued Predictors
- A Penalized Likelihood Method for Classification With Matrix-Valued Predictors
- Multiple-population shrinkage estimation via sliced inverse regression
- Higher-order sliced inverse regressions
- JADE for Tensor-Valued Observations
- Sparse multivariate functional principal component analysis
- Matrix regression heterogeneity analysis