Abstract: We consider a stationary process together with its associated lag operators. Uniform asymptotic expansions of the corresponding empirical eigenvalues and eigenfunctions are established under almost optimal conditions on the lag operators, stated in terms of the eigenvalues (spectral gap). In addition, the underlying dependence assumptions are optimal and cover both short- and long-memory processes. This allows us to study the relative maximum deviation of the empirical eigenvalues under very general conditions. Among other things, we show convergence to an extreme value distribution, which gives rise to the construction of simultaneous confidence sets. We also discuss how the asymptotic expansions transfer to the long-run covariance operator in a general framework.
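The quantities discussed in the abstract can be illustrated numerically. The following is a minimal sketch, not the paper's method: we simulate a weakly dependent (short-memory) functional time series with a known covariance eigenstructure, compute the empirical eigenvalues of the sample covariance operator, and evaluate their relative deviations |λ̂_j − λ_j| / λ_j. All concrete choices here (Fourier eigenfunctions, AR(1) scores, polynomially decaying eigenvalues, parameter values) are assumptions made purely for this demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

d, n, J = 100, 4000, 5                     # grid size, sample size, rank
grid = np.linspace(0.0, 1.0, d)
dx = grid[1] - grid[0]                     # quadrature weight on the grid

# Assumed "true" eigenstructure: polynomially decaying eigenvalues,
# Fourier (sine) eigenfunctions, orthonormal in L2[0, 1].
lam = 1.0 / np.arange(1, J + 1) ** 2
phi = np.sqrt(2.0) * np.sin(np.pi * np.outer(np.arange(1, J + 1), grid))  # J x d

# Stationary AR(1) scores with Var(score_j) = lam_j: a simple
# short-memory dependence structure.
rho = 0.5
scores = np.empty((n, J))
scores[0] = rng.standard_normal(J) * np.sqrt(lam)
innov_sd = np.sqrt(lam * (1.0 - rho ** 2))
for t in range(1, n):
    scores[t] = rho * scores[t - 1] + innov_sd * rng.standard_normal(J)

X = scores @ phi                           # n x d discretised curves

# Empirical covariance operator on the grid and its leading eigenvalues
# (matrix eigenvalues rescaled by the quadrature weight dx).
Xc = X - X.mean(axis=0)
C_hat = Xc.T @ Xc / n                      # d x d sample covariance matrix
lam_hat = np.linalg.eigvalsh(C_hat)[::-1][:J] * dx

# Relative deviations of the empirical eigenvalues, whose maximum is
# the statistic whose extreme-value limit the abstract refers to.
rel_dev = np.abs(lam_hat - lam) / lam
max_rel_dev = rel_dev.max()
```

With dependent data, the accuracy of λ̂_j is governed by an effective sample size smaller than n; here the relative deviations are small because n is large and the AR(1) dependence is moderate.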
Cites work
- scientific article; zbMATH DE number 6290111
- Adaptive functional linear regression
- Asymptotic Theory for Principal Component Analysis
- Asymptotic equivalence of functional linear regression and a white noise inverse problem
- Asymptotic theory for the principal component analysis of a vector random function: Some applications to statistical inference
- Asymptotics of prediction in functional linear regression with functional outputs
- CLT in functional linear regression models
- Central limit theorem for stationary linear processes
- Common functional principal components
- Cramér-Karhunen-Loève representation and harmonic principal component analysis of functional time series
- Detecting Changes in the Mean of Functional Observations
- Dispersion operators and resistant second-order functional data analysis
- Dynamic functional principal components
- Estimation in functional lagged regression
- Estimation of the Mean of Functional Time Series and a Two-Sample Problem
- Extremes and local dependence in stationary sequences
- Fourier analysis of stationary time series in function space
- Functional data analysis for volatility
- Functional data analysis with increasing number of projections
- Functional data analysis
- Gaussian approximations and multiplier bootstrap for maxima of sums of high-dimensional random vectors
- Generalized autoregressive conditional heteroscedasticity
- Heteroskedasticity and Autocorrelation Consistent Covariance Matrix Estimation
- High-dimensional principal projections
- Identifying the finite dimensionality of curve time series
- Inference for functional data with applications
- Learning Theory
- Limit theorems for nonlinear functionals of a stationary Gaussian sequence of vectors
- Linear processes in function spaces. Theory and applications
- Methodology and convergence rates for functional linear regression
- Minimax adaptive tests for the functional linear model
- Minimax and adaptive prediction for functional linear regression
- Monitoring the intraday volatility pattern
- New dependence coefficients. Examples and applications to statistics
- Nonlinear system theory: Another look at dependence
- Nonparametric goodness-of-fit testing under Gaussian models
- On Properties of Functional Principal Components Analysis
- On the CLT for discrete Fourier transforms of functional time series
- On weak invariance principles for sums of dependent random functionals
- Optimal eigen expansions and uniform bounds
- Perturbation of spectral subspaces and solution of linear operator equations
- Prediction in functional linear regression
- Principal component analysis
- Probability and moment inequalities under dependence
- Specification test for conditional distribution with functional data
- Second-order comparison of Gaussian random functions and the geometry of DNA minicircles
- Sharp adaptation for inverse problems with random noise
- Sharp conditions for the CLT of linear processes in a Hilbert space
- Strong invariance principles for dependent random variables
- Test of independence for functional data
- Testing for the mean of random curves: a penalization approach
- Testing stationarity of functional time series
- Testing the equality of covariance operators in functional samples
- Testing the stability of the functional autoregressive process
- Theory for high-order bounds in functional principal components analysis
- Weakly dependent functional data
Cited in (15)
- Expanders that beat the eigenvalue bound: Explicit construction and applications
- Relative perturbation bounds with applications to empirical covariance operators
- Asymptotic properties of principal component projections with repeated eigenvalues
- On consistency and sparsity for high-dimensional functional time series with application to autoregressions
- Quantitative limit theorems and bootstrap approximations for empirical spectral projectors
- Optimal stretching for lattice points and eigenvalues
- Optimal expansion of subspaces for eigenvector approximations
- Optimal eigen expansions and uniform bounds
- Nonparametric density estimation for intentionally corrupted functional data
- Perturbation bounds for eigenspaces under a relative gap condition
- Optimal function-on-scalar regression over complex domains
- Optimal Bilaplacian Eigenvalues
- Nonasymptotic upper bounds for the reconstruction error of PCA
- Simultaneous inference and uniform test for eigensystems of functional data
- Deep spectral Q-learning with application to mobile health
This page was built for publication: Optimal eigen expansions and uniform bounds
MaRDI item Q343789