Information Theoretic Learning

From MaRDI portal
Publication:5187983

DOI: 10.1007/978-1-4419-1570-2 · zbMath: 1206.94003 · OpenAlex: W4230367971 · MaRDI QID: Q5187983

Jose C. Principe

Publication date: 10 March 2010

Published in: Information Science and Statistics

Full work available at URL: https://doi.org/10.1007/978-1-4419-1570-2




Related Items (86)

- Probabilistic Learning Vector Quantization with Cross-Entropy for Probabilistic Class Assignments in Classification Learning
- Robust parametric identification procedure of stochastic nonlinear continuous-discrete systems
- Information-theoretic approaches to SVM feature selection for metagenome read classification
- Learning theory of minimum error entropy under weak moment conditions
- Tsallis and Rényi divergences of generalized Jacobi polynomials
- Improved minimum entropy filtering for continuous nonlinear non-Gaussian systems using a generalized density evolution equation
- Consistency analysis of an empirical minimum error entropy algorithm
- Kernel Entropy Discriminant Analysis for Dimension Reduction
- Proportionate minimum error entropy algorithm for sparse system identification
- A note on the W-S lower bound of the MEE estimation
- An extended result on the optimal estimation under the minimum error entropy criterion
- A Statistical Learning Approach to Modal Regression
- Conditions for the existence of a generalization of Rényi divergence
- Fisher information framework for time series modeling
- On the robustness of regularized pairwise learning methods based on kernels
- Integrative exploration of large high-dimensional datasets
- Unnamed Item
- Statistical Problem Classes and Their Links to Information Theory
- Steady-state tracking analysis of adaptive filter with maximum correntropy criterion
- Gradient descent for robust kernel-based regression
- Information potential for some probability density functions
- Kernel-based sparse regression with the correntropy-induced loss
- Mixture quantized error entropy for recursive least squares adaptive filtering
- Towards understanding sparse filtering: a theoretical perspective
- Square-root algorithms for maximum correntropy estimation of linear discrete-time systems in presence of non-Gaussian noise
- Understanding autoencoders with information theoretic concepts
- Maximum correntropy unscented filter
- A new robust fixed-point algorithm and its convergence analysis
- Finite-size gap, magnetization, and entanglement of deformed Fredkin spin chain
- Identification of the ARX model with random impulse noise based on forgetting factor multi-error information entropy
- On the characterization of cognitive tasks using activity-specific short-lived synchronization between electroencephalography channels
- Kullback–Leibler-Quadratic Optimal Control
- Multierror stochastic gradient algorithm for identification of a Hammerstein system with random noise and its application in the modeling of a continuous stirring tank reactor
- Some further results on the minimum error entropy estimation
- On the smoothed minimum error entropy criterion
- Identification of an ARX model with impulse noise using a variable step size information gradient algorithm based on the kurtosis and minimum Renyi error entropy
- Kernel-based maximum correntropy criterion with gradient descent method
- Multiscale principle of relevant information for hyperspectral image classification
- Spectral-spatial destriping of hyperspectral image via correntropy based sparse representation and unidirectional Huber–Markov random fields
- Correntropy based model predictive controller with multi-constraints for robust path trajectory tracking of self-driving vehicle
- Regularized discriminant entropy analysis
- Rényi entropy of the totally asymmetric exclusion process
- On the Modification of Mellin Convolution Operator and Its Associated Information Potential
- Aspects in classification learning -- review of recent developments in learning vector quantization
- Appropriate Data Density Models in Probabilistic Machine Learning Approaches for Data Analysis
- Robust adaptive filtering based on M-estimation-based minimum error entropy criterion
- Robust stochastic configuration networks with maximum correntropy criterion for uncertain data regression
- Causality detection with matrix-based transfer entropy
- Bernoulli sums and Rényi entropy inequalities
- Inequality-induced similarity measures for radial electronic probability densities in hydrogenic atoms
- Delay Differential Analysis of Electroencephalographic Data
- Parameter Identifiability in Statistical Machine Learning: A Review
- Graph ambiguity
- A Robust Regression Framework with Laplace Kernel-Induced Loss
- Stability and optimization error of stochastic gradient descent for pairwise learning
- Online minimum error entropy algorithm with unbounded sampling
- On the optimization properties of the correntropic loss function in data analysis
- Kernel gradient descent algorithm for information theoretic learning
- Modeling the Uncertainty of a Set of Graphs Using Higher-Order Fuzzy Sets
- Concept drift detection and adaptation with hierarchical hypothesis testing
- Maximum correntropy Kalman filter
- Generalized maximum correntropy detector for non-Gaussian environments
- Robustness analysis of a maximum correntropy framework for linear regression
- Improving discrimination in data envelopment analysis without losing information based on Renyi's entropy
- Note on von Neumann and Rényi entropies of a graph
- Analysis of a novel density matching criterion within the ITL framework for blind channel equalization
- Statistical estimation of quadratic Rényi entropy for a stationary m-dependent sequence
- Maximum correntropy adaptation approach for robust compressive sensing reconstruction
- The Nyström minimum kernel risk-sensitive loss algorithm with \(k\)-means sampling
- Interpretable fault detection using projections of mutual information matrix
- A Regularized Correntropy Framework for Robust Pattern Recognition
- An empirical evaluation of the approximation of subjective logic operators using Monte Carlo simulations
- Projection theorems and estimating equations for power-law models
- Robust support vector machines based on the rescaled hinge loss function
- Estimation of entropy-type integral functionals
- A novel nonparametric distance estimator for densities with error bounds
- Learning with correntropy-induced losses for regression with mixture of symmetric stable noise
- Comparison of the convergence rates of the new correntropy-based Levenberg-Marquardt (CLM) method and the fixed-point maximum correntropy (FP-MCC) algorithm
- Virtual Sensors for Semiconductor Manufacturing: A Nonparametric Approach - Exploiting Information Theoretic Learning and Kernel Machines
- New Insights Into Learning With Correntropy-Based Regression
- Toward a Kernel-Based Uncertainty Decomposition Framework for Data and Models
- Hierarchical Dynamical Model for Multiple Cortical Neural Decoding
- A Framework of Learning Through Empirical Gain Maximization
- Regularization schemes for minimum error entropy principle
- Optimal learning with Gaussians and correntropy loss
- On entropy, entropy-like quantities, and applications




This page was built for publication: Information Theoretic Learning