Distributed kernel gradient descent algorithm for minimum error entropy principle (Q2175022)

From MaRDI portal
Property / MaRDI profile type: MaRDI publication profile
Property / full work available at URL: https://doi.org/10.1016/j.acha.2019.01.002
Property / OpenAlex ID: W2911067482
Property / cites work: Optimal rates for the regularized least-squares algorithm
Property / cites work: Deep relaxation: partial differential equations for optimizing deep neural networks
Property / cites work: Stochastic gradient algorithm under \((h, \phi)\)-entropy criterion
Property / cites work: Learning Theory
Property / cites work: Blind source separation using Renyi's \(\alpha\)-marginal entropies
Property / cites work: Consistency analysis of an empirical minimum error entropy algorithm
Property / cites work: Learning theory of distributed spectral algorithms
Property / cites work: Q4637042
Property / cites work: Thresholded spectral algorithms for sparse approximations
Property / cites work: Q5405253
Property / cites work: Regularization schemes for minimum error entropy principle
Property / cites work: Convergence of Gradient Descent for Minimum Error Entropy Principle in Linear Regression
Property / cites work: Q4637006
Property / cites work: Distributed kernel-based gradient descent algorithms
Property / cites work: Optimum bounds for the distributions of martingales in Banach spaces
Property / cites work: On the optimality of averaging in distributed statistical learning
Property / cites work: Minimum Total Error Entropy Method for Parameter Estimation
Property / cites work: The MEE Principle in Data Classification: A Perceptron-Based Analysis
Property / cites work: On early stopping in gradient descent learning
Property / cites work: Online Pairwise Learning Algorithms
Property / cites work: Unregularized online learning algorithms with general loss functions
Property / cites work: Divide and Conquer Kernel Ridge Regression: A Distributed Algorithm with Minimax Optimal Rates
Property / Wikidata QID: Q128593977

Language: English
Label: Distributed kernel gradient descent algorithm for minimum error entropy principle
Description: scientific article

    Statements

    Distributed kernel gradient descent algorithm for minimum error entropy principle (English)
    Publication date: 27 April 2020
    Keywords: distributed learning, minimum error entropy, gradient descent algorithm, kernel method