On dimension folding of matrix- or array-valued statistical objects
From MaRDI portal
Publication:2380098
Abstract: We consider dimension reduction for regression or classification in which the predictors are matrix- or array-valued. This type of predictor arises when measurements are obtained for each combination of two or more underlying variables--for example, the voltage measured at different channels and times in electroencephalography data. For these applications, it is desirable to preserve the array structure of the reduced predictor (e.g., time versus channel), but this cannot be achieved within the conventional dimension reduction formulation. In this paper, we introduce a dimension reduction method, to be called dimension folding, for matrix- and array-valued predictors that preserves the array structure. In an application of dimension folding to an electroencephalography data set, we correctly classify 97 out of 122 subjects as alcoholic or nonalcoholic based on their electroencephalography in a cross-validation sample.
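The core idea in the abstract — reducing a matrix-valued predictor while keeping its row/column (e.g. time versus channel) structure — can be sketched numerically. The snippet below is an illustration only, not the paper's estimator: the projection bases `Gamma_L` and `Gamma_R` are random stand-ins for bases that dimension folding would estimate from the regression of the response on the predictor.

```python
import numpy as np

# Dimension folding keeps a matrix-valued predictor X (p x q) in matrix
# form by projecting with a left basis Gamma_L (p x d_L) and a right basis
# Gamma_R (q x d_R): the reduced predictor Gamma_L' X Gamma_R is d_L x d_R,
# so the array structure (e.g. time vs channel) is preserved. Conventional
# dimension reduction would vectorize X and lose this structure.

rng = np.random.default_rng(0)
p, q, d_left, d_right = 6, 5, 2, 3

X = rng.standard_normal((p, q))  # one matrix-valued observation

# Orthonormal bases; random here purely for illustration.
Gamma_L, _ = np.linalg.qr(rng.standard_normal((p, d_left)))
Gamma_R, _ = np.linalg.qr(rng.standard_normal((q, d_right)))

folded = Gamma_L.T @ X @ Gamma_R  # reduced predictor, still a matrix
print(folded.shape)               # (2, 3): array structure preserved

# On the vectorized predictor the same reduction is the Kronecker product
# (Gamma_R kron Gamma_L) applied to vec(X), using column-major vec, since
# vec(A X B) = (B' kron A) vec(X):
vec_reduction = np.kron(Gamma_R, Gamma_L).T @ X.flatten(order="F")
assert np.allclose(vec_reduction, folded.flatten(order="F"))
```

The Kronecker identity in the last lines is why the folded reduction is a structured special case of ordinary dimension reduction on the vectorized predictor.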
Recommendations
- Dimension folding PCA and PFC for matrix-valued predictors
- Ensemble sufficient dimension folding methods for analyzing matrix-valued data
- Sufficient dimension folding for a functional of conditional distribution of matrix- or array-valued objects
- Sufficient dimension reduction and prediction in regression
- Dimension reduction in regression without matrix inversion
Cites work
- scientific article (zbMATH DE number 5017166)
- scientific article (zbMATH DE number 4060392)
- scientific article (zbMATH DE number 1220060)
- scientific article (zbMATH DE number 1953122)
- scientific article (zbMATH DE number 1850469)
- Applied multivariate statistical analysis.
- Asymptotics for kernel estimate of sliced inverse regression
- Augmenting the bootstrap to analyze high dimensional genomic data
- Comments on: Augmenting the bootstrap to analyze high dimensional genomic data
- Contour regression: a general approach to dimension reduction
- Determining the dimension of iterative Hessian transformation
- Dimension Reduction for the Conditional kth Moment in Regression
- Dimension reduction for conditional mean in regression
- Dimension reduction for nonelliptically distributed predictors
- Dimension reduction in regression without matrix inversion
- Dimension reduction strategies for analyzing global gene expression data with a response
- Envelope models for parsimonious and efficient multivariate linear regression
- Estimating the structural dimension of regressions via parametric inverse regression
- Generalized Inverses, Ridge Regression, Biased Linear Estimation, and Nonlinear Estimation
- Graphics for Regressions With a Binary Response
- On Directional Regression for Dimension Reduction
- On Principal Hessian Directions for Data Visualization and Dimension Reduction: Another Application of Stein's Lemma
- On a projective resampling method for dimension reduction with multivariate responses
- On the Interpretation of Regression Plots
- Robust dimension reduction based on canonical correlation
- Sliced Inverse Regression for Dimension Reduction
- Slicing regression: A link-free regression method
- Successive direction extraction for estimating the central subspace in a multiple-index regression
- Sufficient Dimension Reduction via Inverse Regression
- Sufficient dimension reduction and graphics in regression
- The commutation matrix: Some properties and applications
- Using the Bootstrap to Select One of a New Class of Dimension Reduction Methods
Cited in (48 documents)
- A Penalized Likelihood Method for Classification With Matrix-Valued Predictors
- Latent factor model for multivariate functional data
- Fast and Separable Estimation in High-Dimensional Tensor Gaussian Graphical Models
- Conditional mean dimension reduction for tensor time series
- Low-rank latent matrix-factor prediction modeling for generalized high-dimensional matrix-variate regression
- Higher-order sliced inverse regressions
- Tensor sufficient dimension reduction
- Predictive power of principal components for single-index model and sufficient dimension reduction
- Model-Free Variable Selection With Matrix-Valued Predictors
- A varying coefficient model with matrix valued covariates
- Sliced inverse regression for integrative multi-omics data analysis
- Sufficient dimension folding via tensor inverse regression
- On the asymptotic normality and efficiency of Kronecker envelope principal component analysis
- A model-free conditional screening approach via sufficient dimension reduction
- A classifier for multi-dimensional datasets based on Bayesian multiple kernel grouping learning
- Sufficient dimension folding for a functional of conditional distribution of matrix- or array-valued objects
- Sparse multivariate functional principal component analysis
- Copula Gaussian Graphical Models for Functional Data
- Functional Graphical Models
- Matrix regression heterogeneity analysis
- Sparse minimum discrepancy approach to sufficient dimension reduction with simultaneous variable selection in ultrahigh dimension
- Likelihood-Based Dimension Folding on Tensor Data
- Tensor Regression with Applications in Neuroimaging Data Analysis
- Robust estimator of the correlation matrix with sparse Kronecker structure for a high-dimensional matrix-variate
- Structured time-dependent inverse regression (STIR)
- A general framework for tensor screening through smoothing
- Nonparametric Functional Graphical Modeling Through Functional Additive Regression Operator
- Dimension folding PCA and PFC for matrix-valued predictors
- Quad folding: a simple idea for the subjective property characterization of large sample sets
- Hypothesis Testing of Matrix Graph Model with Application to Brain Connectivity Analysis
- On sufficient dimension reduction via principal asymmetric least squares
- JADE for Tensor-Valued Observations
- Multiple-population shrinkage estimation via sliced inverse regression
- On testing common indices for two multi-index models: a link-free approach
- Statistical inference on the significance of rows and columns for matrix-valued data in an additive model
- Sliced average variance estimation for tensor data
- scientific article (zbMATH DE number 7415090)
- Dimension Reduction for Fréchet Regression
- Independent component analysis for tensor-valued data
- A portmanteau local feature discrimination approach to the classification with high-dimensional matrix-variate data
- Ensemble sufficient dimension folding methods for analyzing matrix-valued data
- Covariance estimation via sparse Kronecker structures
- Tensor sliced inverse regression
- A Nonparametric Graphical Model for Functional Data With Application to Brain Networks Based on fMRI
- scientific article (zbMATH DE number 7370581)
- Improved estimation in tensor regression with multiple change-points
- Trace pursuit variable selection for multi-population data
- Hyperparameter estimation for sparse Bayesian learning models