On Different Facets of Regularization Theory
From MaRDI portal
Recommendations
- Oversmoothing Tikhonov regularization in Banach spaces
- Regularization theory for ill-posed problems. Selected topics
- Low complexity regularization of linear inverse problems
- Modern regularization methods for inverse problems
- Iterative regularization with a general penalty term: theory and application to \(L^{1}\) and \(TV\) regularization
Cites work
- scientific article; zbMATH DE number 1164152 (no title available)
- DOI: 10.1162/15324430260185646
- A Correspondence Between Bayesian Estimation on Stochastic Processes and Smoothing by Splines
- A Mathematical Theory of Communication
- A Penalty-Function Approach for Pruning Feedforward Neural Networks
- A new look at the statistical model identification
- A regularized solution to edge detection
- A statistical perspective on ill-posed inverse problems (with discussion)
- Adaptive regularization parameter selection method for enhancing generalization capability of neural networks
- An Approach to Time Series Analysis
- Analysis of Discrete Ill-Posed Problems by Means of the L-Curve
- Analysis of hidden nodes for multi-layer perceptron neural networks
- Atomic Decomposition by Basis Pursuit
- Entropy-based algorithms for best basis selection
- Flat Minima
- Generalization bounds for function approximation from scattered noisy data
- Generalization performance of regularization networks and support vector machines via entropy numbers of compact operators
- Ill-posed problems in early vision: from computational theory to analogue networks
- Interpolation of scattered data: distance matrices and conditionally positive definite functions
- Least third-order cumulant method with adaptive regularization parameter selection for neural networks
- Matching pursuits with time-frequency dictionaries
- Minimum description length induction, Bayesianism, and Kolmogorov complexity
- Modeling by shortest data description
- On Tikhonov regularization, bias and variance in nonlinear system identification
- On a class of support vector kernels based on frames in function Hilbert spaces
- On radial basis function nets and kernel regression: Statistical consistency, convergence rates, and receptive field size
- On the mathematical foundations of learning
- Pattern recognition as a quest for minimum entropy
- Probabilistic Solution of Ill-Posed Problems in Computational Vision
- RKHS approach to detection and estimation problems--I: Deterministic signals in Gaussian noise
- Regularization algorithms for learning that are equivalent to multilayer networks
- Regularization networks and support vector machines
- Some extensions of radial basis functions and their applications in artificial intelligence
- Some results on Tchebycheffian spline functions and stochastic processes
- Sparse on-line Gaussian processes
- Spline smoothing: The equivalent variable kernel method
- Stochastic Relaxation, Gibbs Distributions, and the Bayesian Restoration of Images
- Theory of Reproducing Kernels
Cited in (17)
- Convergence analyses on sparse feedforward neural networks via group lasso regularization
- Radial basis function network learning using localized generalization error bound
- Pointwise convergence of Fourier regularization for smoothing data
- Discriminatively regularized least-squares classification
- Assistive Optimal Control-on-Request with Application in Standing Balance Therapy and Reinforcement
- On the minimizers of energy forms with completely monotone kernel
- Online data processing: comparison of Bayesian regularized particle filters
- Learning with boundary conditions
- Foundations of support constraint machines
- Subspace based direction of arrival estimation of DS-CDMA signals using orthogonal projection
- Weak consistency of the support vector machine quantile regression approach when covariates are functions
- From kernel methods to neural networks: a unifying variational formulation
- Sparse regularization for semi-supervised classification
- Optimal recovery from inaccurate data in Hilbert spaces: regularize, but what of the parameter?
- On the uncertainty in the regularized solution
- Generating Spike Trains with Specified Correlation Coefficients
- Classifier learning with a new locality regularization method
This page was built for publication: On Different Facets of Regularization Theory (MaRDI item Q4815036)