Low rank estimation of smooth kernels on graphs
Abstract: Let (V, A) be a weighted graph with a finite vertex set V, a symmetric matrix of nonnegative weights A and Laplacian Δ. Let S_* : V × V → ℝ be a symmetric kernel defined on the vertex set V. Consider n i.i.d. observations (X_j, X'_j, Y_j), j = 1, …, n, where X_j, X'_j are independent random vertices sampled from the uniform distribution in V and Y_j is a real-valued response variable such that E(Y_j | X_j, X'_j) = S_*(X_j, X'_j). The goal is to estimate the kernel S_* based on the data, under the assumption that S_* is low rank and, at the same time, smooth on the graph (the smoothness being characterized by discrete Sobolev norms defined in terms of the graph Laplacian). We obtain several results for such problems, including minimax lower bounds on the L_2-error and upper bounds for penalized least squares estimators, both with nonconvex and with convex penalties.
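The convex-penalty estimator described in the abstract combines a least squares fit over the observed vertex pairs with a nuclear-norm penalty (promoting low rank) and a Laplacian-based Sobolev penalty (promoting smoothness on the graph). A minimal numpy sketch of this idea, solved by proximal gradient descent, is below; the penalty weights, step size, and the simple path-graph example are illustrative assumptions, not the authors' exact estimator or tuning.

```python
import numpy as np

def graph_laplacian(A):
    """Unnormalized graph Laplacian Delta = D - A."""
    return np.diag(A.sum(axis=1)) - A

def svt(M, tau):
    """Singular value thresholding: proximal operator of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def estimate_kernel(X, Xp, Y, m, L, lam=0.01, eps=1e-3, lr=0.5, steps=500):
    """Proximal gradient descent for the penalized least squares objective
       (1/n) * sum_j (Y_j - S[X_j, Xp_j])^2 + eps * tr(S' L S) + lam * ||S||_*,
       a convex surrogate in the spirit of the abstract (hyperparameters
       lam, eps, lr, steps are illustrative choices)."""
    n = len(Y)
    S = np.zeros((m, m))
    for _ in range(steps):
        G = 2.0 * eps * (L @ S)                            # Sobolev-penalty gradient
        np.add.at(G, (X, Xp), (2.0 / n) * (S[X, Xp] - Y))  # data-fit gradient
        S = svt(S - lr * G, lr * lam)                      # step + nuclear-norm prox
        S = 0.5 * (S + S.T)                                # keep the kernel symmetric
    return S

# Tiny demo: a path graph and a rank-1 kernel S_*(u, v) = f(u) f(v)
# built from a smooth function f on the vertices.
rng = np.random.default_rng(0)
m = 8
A = np.zeros((m, m))
for i in range(m - 1):
    A[i, i + 1] = A[i + 1, i] = 1.0
L = graph_laplacian(A)
f = np.linspace(0.0, 1.0, m)
S_true = np.outer(f, f)

n = 2000
X = rng.integers(0, m, size=n)                    # vertices sampled uniformly in V
Xp = rng.integers(0, m, size=n)
Y = S_true[X, Xp] + 0.1 * rng.standard_normal(n)  # E[Y | X, X'] = S_*(X, X')

S_hat = estimate_kernel(X, Xp, Y, m, L)
rel_err = np.linalg.norm(S_hat - S_true) / np.linalg.norm(S_true)
```

The nuclear-norm prox step (`svt`) is what exploits low rank, while the `eps * tr(S' L S)` term mirrors the discrete Sobolev smoothness penalty; the paper's nonconvex variant would replace the nuclear norm with a rank-based penalty.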
Recommendations
- Low rank estimation of similarities on graphs
- Kernels and regularization on graphs
- Estimating a smooth function on a large graph by Bayesian Laplacian regularisation
- Graph kernels
- Graph kernels: a survey
- Kernelization using structural parameters on sparse graph classes
Cites work
- scientific article; zbMATH DE number 4044567
- scientific article; zbMATH DE number 1254560
- Concentration inequalities and model selection. Ecole d'Eté de Probabilités de Saint-Flour XXXIII -- 2003.
- Estimation of high-dimensional low-rank matrices
- Exact matrix completion via convex optimization
- Guaranteed minimum-rank solutions of linear matrix equations via nuclear norm minimization
- Hyper-sparse optimal aggregation
- Introduction to nonparametric estimation
- Low rank estimation of similarities on graphs
- Noisy low-rank matrix completion with general sampling distribution
- Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion
- Oracle inequalities in empirical risk minimization and sparse recovery problems. École d'Été de Probabilités de Saint-Flour XXXVIII-2008.
- Recovering Low-Rank Matrices From Few Coefficients in Any Basis
- Restricted strong convexity and weighted matrix completion: optimal bounds with noise
- The Power of Convex Relaxation: Near-Optimal Matrix Completion
- Tight Oracle Inequalities for Low-Rank Matrix Recovery From a Minimal Number of Noisy Random Measurements
- User-friendly tail bounds for sums of random matrices
Cited in (7)
- A low discrepancy sequence on graphs
- Low rank estimation of similarities on graphs
- Universally consistent vertex classification for latent positions graphs
- Minimax lower bounds for function estimation on graphs
- Estimation of low-rank covariance function
- scientific article; zbMATH DE number 5769858
- Estimating a smooth function on a large graph by Bayesian Laplacian regularisation