The following pages link to Information Theoretic Learning (Q5187983):
Displayed 50 items.
- On entropy, entropy-like quantities, and applications (Q256893)
- Improved minimum entropy filtering for continuous nonlinear non-Gaussian systems using a generalized density evolution equation (Q280433)
- Consistency analysis of an empirical minimum error entropy algorithm (Q285539)
- Proportionate minimum error entropy algorithm for sparse system identification (Q296351)
- A note on the W-S lower bound of the MEE estimation (Q296442)
- An extended result on the optimal estimation under the minimum error entropy criterion (Q296457)
- On the robustness of regularized pairwise learning methods based on kernels (Q325147)
- Some further results on the minimum error entropy estimation (Q406075)
- On the smoothed minimum error entropy criterion (Q406217)
- Maximum correntropy Kalman filter (Q503143)
- Note on von Neumann and Rényi entropies of a graph (Q513257)
- Robustness analysis of a maximum correntropy framework for linear regression (Q680523)
- A novel nonparametric distance estimator for densities with error bounds (Q742711)
- Regularized discriminant entropy analysis (Q898335)
- Aspects in classification learning -- review of recent developments in learning vector quantization (Q902754)
- Tsallis and Rényi divergences of generalized Jacobi polynomials (Q1619732)
- Integrative exploration of large high-dimensional datasets (Q1647602)
- Square-root algorithms for maximum correntropy estimation of linear discrete-time systems in presence of non-Gaussian noise (Q1678568)
- A new robust fixed-point algorithm and its convergence analysis (Q1684922)
- Concept drift detection and adaptation with hierarchical hypothesis testing (Q1738599)
- Improving discrimination in data envelopment analysis without losing information based on Renyi's entropy (Q1989807)
- Analysis of a novel density matching criterion within the ITL framework for blind channel equalization (Q2003155)
- Maximum correntropy adaptation approach for robust compressive sensing reconstruction (Q2004743)
- The Nyström minimum kernel risk-sensitive loss algorithm with \(k\)-means sampling (Q2005420)
- Interpretable fault detection using projections of mutual information matrix (Q2027446)
- Projection theorems and estimating equations for power-law models (Q2034450)
- Conditions for the existence of a generalization of Rényi divergence (Q2141828)
- Fisher information framework for time series modeling (Q2145602)
- Towards understanding sparse filtering: a theoretical perspective (Q2179299)
- Understanding autoencoders with information theoretic concepts (Q2185600)
- Kernel-based maximum correntropy criterion with gradient descent method (Q2191846)
- Robust stochastic configuration networks with maximum correntropy criterion for uncertain data regression (Q2200707)
- Kernel gradient descent algorithm for information theoretic learning (Q2223567)
- An empirical evaluation of the approximation of subjective logic operators using Monte Carlo simulations (Q2283288)
- Robust support vector machines based on the rescaled hinge loss function (Q2290317)
- Learning with correntropy-induced losses for regression with mixture of symmetric stable noise (Q2300760)
- Comparison of the convergence rates of the new correntropy-based Levenberg-Marquardt (CLM) method and the fixed-point maximum correntropy (FP-MCC) algorithm (Q2312514)
- Information-theoretic approaches to SVM feature selection for metagenome read classification (Q2362468)
- Steady-state tracking analysis of adaptive filter with maximum correntropy criterion (Q2400927)
- Kernel-based sparse regression with the correntropy-induced loss (Q2409039)
- Graph ambiguity (Q2446033)
- On the optimization properties of the correntropic loss function in data analysis (Q2448161)
- Information potential for some probability density functions (Q2660369)
- Mixture quantized error entropy for recursive least squares adaptive filtering (Q2667442)
- Bernoulli sums and Rényi entropy inequalities (Q2692548)
- Inequality-induced similarity measures for radial electronic probability densities in hydrogenic atoms (Q2696370)
- Estimation of entropy-type integral functionals (Q2807736)
- Modeling the Uncertainty of a Set of Graphs Using Higher-Order Fuzzy Sets (Q3299861)
- Gradient descent for robust kernel-based regression (Q4571003)
- Finite-size gap, magnetization, and entanglement of deformed Fredkin spin chain (Q4589372)