Learning theory of distributed spectral algorithms

From MaRDI portal
Publication:5348011

DOI: 10.1088/1361-6420/aa72b2 · zbMath: 1372.65162 · OpenAlex: W2613940844 · MaRDI QID: Q5348011

Ding-Xuan Zhou, Zheng-Chu Guo, Shao-Bo Lin

Publication date: 11 August 2017

Published in: Inverse Problems

Full work available at URL: https://doi.org/10.1088/1361-6420/aa72b2



Related Items

Distributed spectral pairwise ranking algorithms
Distributed regression learning with coefficient regularization
Deep distributed convolutional neural networks: Universality
Distributed learning via filtered hyperinterpolation on manifolds
Gradient descent for robust kernel-based regression
Distributed learning with partial coefficients regularization
Manifold regularization based on Nyström type subsampling
Distributed kernel gradient descent algorithm for minimum error entropy principle
Multi-task learning via linear functional strategy
Distributed semi-supervised regression learning with coefficient regularization
Averaging versus voting: a comparative study of strategies for distributed classification
Distributed learning with multi-penalty regularization
Preface for Inverse Problems special issue on learning and inverse problems
Spectral algorithms for learning with dependent observations
Capacity dependent analysis for functional online learning algorithms
Distributed learning for sketched kernel regression
Tikhonov regularization with oversmoothing penalty for nonlinear statistical inverse problems
Convex regularization in statistical inverse learning problems
Domain Generalization by Functional Regression
Inverse learning in Hilbert scales
Estimates on learning rates for multi-penalty distribution regression
Coefficient-based regularized distribution regression
Online regularized learning algorithm for functional data
Communication-efficient estimation of high-dimensional quantile regression
Distributed learning and distribution regression of coefficient regularization
Kernel regression, minimax rates and effective dimensionality: Beyond the regular case
On the Improved Rates of Convergence for Matérn-Type Kernel Ridge Regression with Application to Calibration of Computer Models
Coefficient-based regularization network with variance loss for error
Robust kernel-based distribution regression
Partially functional linear regression with quadratic regularization
Balancing principle in supervised learning for a general regularization scheme
Convergence of online mirror descent
Optimal learning rates for distribution regression
Distributed estimation of principal eigenspaces
Analysis of regularized least squares for functional linear regression model
Universality of deep convolutional neural networks
Optimal rates for spectral algorithms with least-squares regression over Hilbert spaces
Regularized Nyström subsampling in regression and ranking problems under general smoothness assumptions
Semi-supervised learning with summary statistics
Distributed learning with indefinite kernels
Optimal rates for coefficient-based regularized regression
Distributed Filtered Hyperinterpolation for Noisy Data on the Sphere
Distributed least squares prediction for functional linear regression*



Cites Work