On detecting the optimal structure of a neural network under strong statistical features in errors
From MaRDI portal
Publication: Q4979103
Keywords: artificial neural networks; heteroskedasticity; simulation; GARCH models; wild bootstrap; Lagrange multiplier tests; neglected nonlinearity tests
MSC classifications
- Time series, auto-correlation, regression, etc. in statistics (GARCH) (62M10)
- Applications of statistics to biology and medical sciences; meta analysis (62P10)
- Sequential statistical analysis (62L10)
- Non-Markovian processes: hypothesis testing (62M07)
- Neural nets and related approaches to inference from stochastic processes (62M45)
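The keywords combine neural-network neglected-nonlinearity tests with wild-bootstrap inference under heteroskedastic (GARCH-type) errors. A minimal sketch of that combination follows; the function names and the simple RESET-style LM statistic are illustrative assumptions, not the publication's actual procedure:

```python
import numpy as np

def lm_nonlinearity_stat(y, X):
    """n*R^2 from regressing OLS residuals on X augmented with squared
    fitted values (a simple RESET-style neglected-nonlinearity check)."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    u = y - X @ beta
    Z = np.column_stack([X, (X @ beta) ** 2])
    g, *_ = np.linalg.lstsq(Z, u, rcond=None)
    r = u - Z @ g
    ssr0, ssr1 = u @ u, r @ r
    r2 = 1.0 - ssr1 / ssr0 if ssr0 > 0 else 0.0
    return len(y) * r2

def wild_bootstrap_pvalue(y, X, stat_fn, B=999, rng=None):
    """Wild-bootstrap p-value for a regression-based test statistic.

    Residuals are multiplied by i.i.d. Rademacher weights, which preserves
    conditional heteroskedasticity of unknown form in the resampled errors.
    """
    rng = np.random.default_rng(rng)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    fitted = X @ beta
    resid = y - fitted
    t_obs = stat_fn(y, X)
    count = 0
    for _ in range(B):
        v = rng.choice([-1.0, 1.0], size=len(y))  # Rademacher weights
        y_star = fitted + resid * v               # wild bootstrap sample
        if stat_fn(y_star, X) >= t_obs:
            count += 1
    return (count + 1) / (B + 1)
```

Under the null of a correct linear mean, resampling from the fitted linear model with sign-flipped residuals mimics the heteroskedastic error distribution, so the bootstrap p-value controls size where the asymptotic chi-squared critical values can fail.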
Recommendations
- scientific article; zbMATH DE number 1368813
- Neural networks and statistical inference: seeking robust and efficient learning
- scientific article; zbMATH DE number 67623
- The strong consistency of error probability estimates in NN discrimination
- An improved determination approach to the structure and parameters of dynamic structure-based neural networks
- Probabilistic robustness estimates for feed-forward neural networks
- A statistical model of neural network learning via the Cramér-Rao lower bound
Cites work
- scientific article; zbMATH DE number 4090638
- scientific article; zbMATH DE number 1168350
- Autoregressive Conditional Heteroscedasticity with Estimates of the Variance of United Kingdom Inflation
- Bootstrap and wild bootstrap for high dimensional linear models
- Bootstrap procedures under some non-i.i.d. models
- Bootstrapping autoregressions with conditional heteroskedasticity of unknown form
- Conditional Heteroskedasticity in Asset Returns: A New Approach
- Controlling the finite sample significance levels of heteroskedasticity-robust tests of several linear restrictions on regression coefficients
- Diagnostic Checking in a Flexible Nonlinear Time Series Model
- Generalized autoregressive conditional heteroscedasticity
- Hypothesis Testing When a Nuisance Parameter is Present Only Under the Alternative
- Jackknife, bootstrap and other resampling methods in regression analysis
- Multilayer feedforward networks are universal approximators
- On the application of robust, regression-based diagnostics to models of conditional means and conditional variances
- Testing for neglected nonlinearity in time series models. A comparison of neural network methods and alternative tests
- The wild bootstrap and heteroskedasticity-robust tests for serial correlation in dynamic regression models
- The wild bootstrap, tamed at last
Cited in (2)