Fast rates of minimum error entropy with heavy-tailed noise
Publication: 2168008
DOI: 10.1016/j.jat.2022.105796
OpenAlex: W4285013867
Wikidata: Q114164892 · Scholia: Q114164892
MaRDI QID: Q2168008
Publication date: 31 August 2022
Published in: Journal of Approximation Theory
Full work available at URL: https://doi.org/10.1016/j.jat.2022.105796
Keywords: convergence rate; concentration inequality; moment condition; minimum error entropy; saturation effect; empirical risk
MSC classification: Computational learning theory (68Q32); Learning and adaptive systems in artificial intelligence (68T05); Harmonic analysis on Euclidean spaces (42-XX); Approximations and expansions (41-XX)
Cites Work
- A Mathematical Theory of Communication
- Consistency analysis of an empirical minimum error entropy algorithm
- Maximum correntropy Kalman filter
- Cross-entropy measure of uncertain variables
- Learning under \((1 + \epsilon)\)-moment conditions
- Model selection for regularized least-squares algorithm in learning theory
- On regularization algorithms in learning theory
- Empirical risk minimization for heavy-tailed losses
- Robustness of reweighted least squares kernel based regression
- A linear functional strategy for regularized ranking
- Blind source separation using Rényi's \(\alpha\)-marginal entropies
- The covering number in learning theory
- Least squares after model selection in high-dimensional sparse models
- Entropy controlled Laplacian regularization for least square regression
- Robust pairwise learning with Huber loss
- Distributed kernel gradient descent algorithm for minimum error entropy principle
- Learning with correntropy-induced losses for regression with mixture of symmetric stable noise
- Kernel-based sparse regression with the correntropy-induced loss
- Consistency and robustness of kernel-based regression in convex risk minimization
- A tutorial on the cross-entropy method
- Empirical minimization
- Local Rademacher complexities
- The MEE Principle in Data Classification: A Perceptron-Based Analysis
- Learning Theory
- Bayesian Framework for Least-Squares Support Vector Machine Classifiers, Gaussian Processes, and Kernel Fisher Discriminant Analysis
- Correntropy: Properties and Applications in Non-Gaussian Signal Processing
- Generalized correlation function: definition, properties, and application to blind equalization
- Robust Hyperspectral Unmixing With Correntropy-Based Metric
- Convergence of Gradient Descent for Minimum Error Entropy Principle in Linear Regression
- Entropy-based algorithms for best basis selection
- A Regularized Correntropy Framework for Robust Pattern Recognition
- Regularization schemes for minimum error entropy principle
- Probability Inequalities for Sums of Bounded Random Variables
- Aggregation of regularized solutions from multiple observation models
- Robust Estimation of a Location Parameter
- Optimal learning with Gaussians and correntropy loss