scientific article; zbMATH DE number 7306873
Publication:5148959
Authors: Ryumei Nakada, Masaaki Imaizumi
Publication date: 5 February 2021
Full work available at URL: https://arxiv.org/abs/1907.02177
Title: Adaptive Approximation and Generalization of Deep Neural Network with Intrinsic Dimensionality
Related Items (16)
- Stationary Density Estimation of Itô Diffusions Using Deep Learning
- Machine learning for prediction with missing dynamics
- Intrinsic Dimension Adaptive Partitioning for Kernel Methods
- Approximation bounds for norm constrained neural networks with applications to regression and GANs
- Estimation of a regression function on a manifold by fully connected deep neural networks
- A Deep Generative Approach to Conditional Sampling
- The Kolmogorov-Arnold representation theorem revisited
- Drift estimation for a multi-dimensional diffusion process using deep neural networks
- Deep Neural Networks with ReLU-Sine-Exponential Activations Break Curse of Dimensionality in Approximation on Hölder Class
- Deep nonparametric estimation of intrinsic data structures by chart autoencoders: generalization error and robustness
- Neural network approximation and estimation of classifiers with classification boundary in a Barron class
- Deep nonparametric regression on approximate manifolds: nonasymptotic error bounds with polynomial prefactors
- Intrinsic and extrinsic deep learning on manifolds
- Deep Network Approximation for Smooth Functions
- Deep Network With Approximation Error Being Reciprocal of Width to Power of Square Root of Depth
- Deep neural networks can stably solve high-dimensional, noisy, non-linear inverse problems
Uses Software
Cites Work
- Bayesian manifold regression
- Translated Poisson mixture model for stratification learning
- Approximation and estimation bounds for artificial neural networks
- Information-theoretic determination of minimax rates of convergence
- Weak convergence and empirical processes. With applications to statistics
- Optimal approximation of piecewise smooth functions using deep ReLU neural networks
- Nonparametric regression using deep neural networks with ReLU activation function
- On deep learning as a remedy for the curse of dimensionality in nonparametric regression
- Finding the homology of submanifolds with high confidence from random samples
- Mathematical Foundations of Infinite-Dimensional Statistical Models
- Universal approximation bounds for superpositions of a sigmoidal function
- Approximation by Combinations of ReLU and Squared ReLU Ridge Functions With $\ell^1$ and $\ell^0$ Controls
- High-Dimensional Statistics
- Laplacian Eigenmaps for Dimensionality Reduction and Data Representation
- Neural Network Learning
- Dimensions, Whitney covers, and tubular neighborhoods
- Breaking the Curse of Dimensionality with Convex Neural Networks
- Minimax Manifold Estimation
- Understanding Machine Learning
- An Algorithm for Finding Intrinsic Dimensionality of Data