scientific article; zbMATH DE number 17222
zbMATH Open: 0739.62001 · MaRDI QID: Q3973919
Authors: Andrew R. Barron
Publication date: 26 June 1992
Title of this publication is not available
Recommendations
- Practical complexity control in multilayer perceptrons
- Bounds on the complexity of neural‐network models and comparison with linear methods
- Complexity of learning in artificial neural networks
- Complexity and neural networks
- Neural networks and complexity theory
- Complexity matching in neural networks
- Title not available (scientific article; zbMATH DE number 67614; Publication: 3484350)
- Estimates of Data Complexity in Neural-Network Learning
- Approximation by Fully Complex Multilayer Perceptrons
Keywords: model selection; artificial neural networks; consistency; parameter estimation error; rates of convergence; polynomial regression; approximation error; risk bounds; statistical estimation of functions; complexity of models; complexity regularization criteria; index of resolvability; minimum description-length criteria; near asymptotic optimality
MSC: Statistical aspects of information-theoretic topics (62B10); Density estimation (62G07); Asymptotic properties of nonparametric inference (62G20); Applications of statistics (62P99)
Cited In (47)
- Risk bounds for model selection via penalization
- Model selection in nonparametric regression
- Analysis of the rate of convergence of fully connected deep neural network regression estimates with smooth activation function
- Interpreting neural-network results: a simulation study.
- Smooth discrimination analysis
- Collocation approximation by deep neural ReLU networks for parametric and stochastic PDEs with lognormal inputs
- Model selection in reinforcement learning
- How well can a regression function be estimated if the distribution of the (random) design is concentrated on a finite set?
- Functional aggregation for nonparametric regression.
- Analysis of the rate of convergence of least squares neural network regression estimates in case of measurement errors
- Discriminatively regularized least-squares classification
- Quasicycles revisited: apparent sensitivity to initial conditions
- Density estimation by the penalized combinatorial method
- A new method for estimation and model selection: \(\rho\)-estimation
- Nonlinear orthogonal series estimates for random design regression
- Self-regulated complexity in neural networks
- Adaptive estimation in autoregression or \(\beta\)-mixing regression via model selection
- On deep learning as a remedy for the curse of dimensionality in nonparametric regression
- Neural networks and logistic regression: Part I
- On the mathematical foundations of learning
- Approximation and learning by greedy algorithms
- On the Problem in Model Selection of Neural Network Regression in Overrealizable Scenario
- Deep ReLU neural network approximation in Bochner spaces and applications to parametric PDEs
- Hybrid machine learning model for continuous microarray time series
- A multi-loss super regression learner (MSRL) with application to survival prediction using proteomics
- Title not available
- On estimation of surrogate models for multivariate computer experiments
- Data-adaptive estimation of the treatment-specific mean
- Synchronous Boltzmann machines can be universal approximators
- Nonparametric estimation of low rank matrix valued function
- Optimal aggregation of classifiers in statistical learning.
- About the non-asymptotic behaviour of Bayes estimators
- Ridgelets: estimating with ridge functions
- On the rate of convergence of fully connected deep neural network regression estimates
- Information-theoretic determination of minimax rates of convergence
- Nonasymptotic bounds on the \(L_{2}\) error of neural network regression estimates
- Model selection based on minimum description length
- Title not available
- Multiclass classification with potential function rules: margin distribution and generalization
- Suboptimal behavior of Bayes and MDL in classification under misspecification
- Title not available
- Time-penalised trees (\texttt{TpT}): introducing a new tree-based data mining algorithm for time-varying covariates
- Gaussian model selection with an unknown variance
- Deep ReLU networks and high-order finite element methods
- Theory of Classification: a Survey of Some Recent Advances
- Model selection by bootstrap penalization for classification
- Deep learning in high dimension: neural network expression rates for generalized polynomial chaos expansions in UQ