Tail inverse regression: dimension reduction for prediction of extremes
DOI: 10.3150/23-BEJ1606
arXiv: 2108.01432
OpenAlex: W4388506941
MaRDI QID: Q6137714
FDO: Q6137714
Authors: Anass Aghbalou, François Portier, Anne Sabourin, Chen Zhou
Publication date: 16 January 2024
Published in: Bernoulli
Abstract: We consider the problem of supervised dimension reduction with a particular focus on extreme values of the target \(Y\) to be explained by a covariate vector \(X\). The general purpose is to define and estimate a projection onto a lower-dimensional subspace of the covariate space which is sufficient for predicting exceedances of the target above high thresholds. We propose an original definition of Tail Conditional Independence which matches this purpose. Inspired by Sliced Inverse Regression (SIR) methods, we develop a novel framework (TIREX, Tail Inverse Regression for EXtreme response) in order to estimate an extreme sufficient dimension reduction (SDR) space of potentially smaller dimension than that of a classical SDR space. We prove the weak convergence of the tail empirical processes involved in the estimation procedure and illustrate the relevance of the proposed approach on simulated and real-world data.
Full work available at URL: https://arxiv.org/abs/2108.01432
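To make the slicing idea concrete, the following is a minimal, illustrative sketch of a SIR-style inverse regression restricted to tail exceedances of the response, in the spirit of the TIREX framework summarized above. It is not the authors' estimator: the function name tail_sir_directions, the choice of tail fraction, the number of slices, and the whitening and eigen-decomposition details are assumptions made for this example only.

```python
import numpy as np

def tail_sir_directions(X, y, tail_frac=0.1, n_slices=5, n_directions=2):
    """Illustrative SIR-style inverse regression restricted to large responses.

    Standardizes X, keeps the observations whose response exceeds a high
    empirical quantile, slices those exceedances by the value of y, and
    returns the leading eigenvectors of the between-slice covariance of the
    standardized slice means, mapped back to the original covariate scale.
    This is a sketch for intuition, not the TIREX procedure of the paper.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    n, p = X.shape

    # Standardize (whiten) the covariates, as in classical SIR.
    mean = X.mean(axis=0)
    cov = np.cov(X, rowvar=False)
    w, V = np.linalg.eigh(cov)
    inv_sqrt = V @ np.diag(1.0 / np.sqrt(np.clip(w, 1e-12, None))) @ V.T
    Z = (X - mean) @ inv_sqrt

    # Keep only exceedances of y above a high empirical threshold.
    threshold = np.quantile(y, 1.0 - tail_frac)
    tail = y >= threshold
    Z_tail, y_tail = Z[tail], y[tail]

    # Slice the exceedances by the value of y and average Z within each slice.
    order = np.argsort(y_tail)
    M = np.zeros((p, p))
    for idx in np.array_split(order, n_slices):
        if len(idx) == 0:
            continue
        m = Z_tail[idx].mean(axis=0)
        M += (len(idx) / len(y_tail)) * np.outer(m, m)

    # Leading eigenvectors of M span the estimated directions in the whitened
    # scale; map them back to the original covariate scale.
    eigval, eigvec = np.linalg.eigh(M)
    top = eigvec[:, np.argsort(eigval)[::-1][:n_directions]]
    return inv_sqrt @ top
```

For instance, tail_sir_directions(X, y, tail_frac=0.05, n_slices=4, n_directions=1) would return a single candidate direction estimated from the largest 5% of the responses; the specific tuning values are hypothetical.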
Cites Work
- Scikit-learn: machine learning in Python
- A constructive approach to the estimation of dimension reduction directions
- Weak convergence and empirical processes. With applications to statistics
- Asymptotic Statistics
- Weak convergence of empirical copula processes
- Sliced Inverse Regression for Dimension Reduction
- Statistics of Extremes
- Bootstrap testing of the rank of a matrix via least-squared constrained estimation
- Sliced Inverse Regression with Regularizations
- Extreme value theory. An introduction.
- Title not available
- Comment
- On almost linearity of low dimensional projections from high dimensional data
- Weighted empirical and quantile processes
- Title not available
- Investigating Smooth Multiple Regression by the Method of Average Derivatives
- Title not available
- Dimension reduction in regressions through cumulative slicing estimation
- Extended conditional independence and applications in causal inference
- Kernel dimension reduction in regression
- Title not available
- Common risk factors in the returns on stocks and bonds
- Sufficient Dimension Reduction via Inverse Regression
- Hidden regular variation, second order regular variation and asymptotic independence
- Structure adaptive approach for dimension reduction.
- Dimension reduction for conditional mean in regression
- On semiparametric \(M\)-estimation in single-index regression
- Heavy-Tail Phenomena
- The normal distribution. Characterizations with applications
- Regularly varying functions
- Sparse regular variation
- Structured variable selection with sparsity-inducing norms
- Strong limit theorems for weighted quantile processes
- A characterization of spherical distributions
- A new algorithm for estimating the effective dimension-reduction subspace
- Determining the dependence structure of multivariate extremes
- Robust bounds in multivariate extremes
- Dimension reduction in multivariate extreme value analysis
- Identifying groups of variables with the potential of being large simultaneously
- Sparse representation of multivariate extremes with applications to anomaly detection
- A multivariate extreme value theory approach to anomaly clustering and visualization
- Tail dimension reduction for extreme quantile estimation
- Central quantile subspace
- Graphical Models for Extremes
- Decompositions of dependence for high-dimensional extremes
- Principal component analysis for multivariate extremes
- Optimal transformation: a new approach for covering the central subspace
- One-component regular variation and graphical modeling of extremes
- Slice inverse regression with score functions
- \(k\)-means clustering of extremes
- Extreme partial least-squares
- Inference on extremal dependence in the domain of attraction of a structured Hüsler-Reiss distribution motivated by a Markov tree with latent variables
- An empirical process view of inverse regression
Cited In (1)