Oracle inequalities in empirical risk minimization and sparse recovery problems. École d'Été de Probabilités de Saint-Flour XXXVIII-2008.
DOI: 10.1007/978-3-642-22147-7 · zbMATH Open: 1223.91002 · OpenAlex: W648260396 · MaRDI QID: Q549116
Authors: Vladimir Koltchinskii
Publication date: 7 July 2011
Published in: Lecture Notes in Mathematics
Full work available at URL: https://doi.org/10.1007/978-3-642-22147-7
Recommendations
- Discussion of "On concentration for (regularized) empirical risk minimization" by Sara van de Geer and Martin Wainwright
- Oracle inequalities for local and global empirical risk minimizers
- Sharp oracle inequalities in low rank estimation
- General nonexact oracle inequalities for classes with a subexponential envelope
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
Classification:
- Classification and discrimination; cluster analysis (statistical aspects) (62H30)
- Estimation in multivariate analysis (62H12)
- Random matrices (probabilistic aspects) (60B20)
- Signal detection and filtering (aspects of stochastic processes) (60G35)
- Research exposition (monographs, survey articles) pertaining to game theory, economics, and finance (91-02)
- Special problems of linear programming (transportation, multi-index, data envelopment analysis, etc.) (90C08)
Cited in (showing first 100 items)
- The partial linear model in high dimensions
- Directed Community Detection With Network Embedding
- Bayesian fractional posteriors
- Bounding the expectation of the supremum of empirical processes indexed by Hölder classes
- Low-rank model with covariates for count data with missing values
- Low rank estimation of similarities on graphs
- Outlier detection in networks with missing links
- Dimensionality reduction with subgaussian matrices: a unified theory
- On the prediction loss of the Lasso in the partially labeled setting
- \(L_1\)-penalization in functional linear regression with subgaussian design
- Optimal learning with \textit{Q}-aggregation
- Sparse recovery under weak moment assumptions
- Concentration of the empirical level sets of Tukey's halfspace depth
- On the asymptotic variance of the debiased Lasso
- Censored linear model in high dimensions. Penalised linear regression on high-dimensional data with left-censored response variable
- Regularization and the small-ball method. I: Sparse recovery
- Von Neumann entropy penalization and low-rank matrix estimation
- Sparse and low-rank multivariate Hawkes processes
- Analysis of the generalization error: empirical risk minimization over deep artificial neural networks overcomes the curse of dimensionality in the numerical approximation of Black-Scholes partial differential equations
- An exponential inequality for suprema of empirical processes with heavy tails on the left
- Optimal prediction for high-dimensional functional quantile regression in reproducing kernel Hilbert spaces
- Oracle inequalities for high-dimensional prediction
- Slope meets Lasso: improved oracle bounds and optimality
- Noisy low-rank matrix completion with general sampling distribution
- On oracle inequalities related to data-driven hard thresholding
- Robust machine learning by median-of-means: theory and practice
- Oracle inequalities for the Lasso in the high-dimensional Aalen multiplicative intensity model
- Conditional quantile processes based on series or many regressors
- High-dimensional Ising model selection with Bayesian information criteria
- Honest confidence sets in nonparametric IV regression and other ill-posed models
- Partially linear functional quantile regression in a reproducing kernel Hilbert space
- Rademacher complexity for Markov chains: applications to kernel smoothing and Metropolis-Hastings
- An elementary analysis of ridge regression with random design
- A rank-corrected procedure for matrix completion with fixed basis coefficients
- Estimation of low rank density matrices: bounds in Schatten norms and other distances
- Statistical learning theory and stochastic optimization. École d'Été de Probabilités de Saint-Flour XXXI -- 2001.
- Sparse recovery in convex hulls via entropy penalization
- Surrogate losses in passive and active learning
- Nonparametric estimation of low rank matrix valued function
- Optimal prediction of quantile functional linear regression in reproducing kernel Hilbert spaces
- Approximating polyhedra with sparse inequalities
- Low rank estimation of smooth kernels on graphs
- Estimation from nonlinear observations via convex programming with application to bilinear regression
- U-Processes and Preference Learning
- Tail index estimation, concentration and adaptivity
- Matrix concentration inequalities via the method of exchangeable pairs
- On concentration for (regularized) empirical risk minimization
- Optimal exponential bounds on the accuracy of classification
- Kullback-Leibler aggregation and misspecified generalized linear models
- Local Rademacher complexity: sharper risk bounds with and without unlabeled samples
- Some lower bounds on sparse outer approximations of polytopes
- Discussion of "On concentration for (regularized) empirical risk minimization" by Sara van de Geer and Martin Wainwright
- High-dimensional model recovery from random sketched data by exploring intrinsic sparsity
- Learning without concentration
- Sharp oracle inequalities in low rank estimation
- General nonexact oracle inequalities for classes with a subexponential envelope
- Cox process functional learning
- Geometric median and robust estimation in Banach spaces
- Permutational Rademacher Complexity
- Approximate nonparametric quantile regression in reproducing kernel Hilbert spaces via random projection
- On the optimality of the empirical risk minimization procedure for the convex aggregation problem
- Sparse high-dimensional semi-nonparametric quantile regression in a reproducing kernel Hilbert space
- Concentration inequalities for statistical inference
- Learning without concentration for general loss functions
- Phase retrieval: stability and recovery guarantees
- Divide-and-conquer for debiased \(l_1\)-norm support vector machine in ultra-high dimensions
- Sparsity in penalized empirical risk minimization
- Oracle inequalities and optimal inference under group sparsity
- Robust matrix completion
- Matrix completion by singular value thresholding: sharp bounds
- Sparse learning for large-scale and high-dimensional data: a randomized convex-concave optimization approach
- Estimating a network from multiple noisy realizations
- The expected norm of a sum of independent random matrices: an elementary approach
- Concentration inequalities for matrix martingales in continuous time
- Estimation and variable selection of quantile partially linear additive models for correlated data
- Locally adaptive sparse additive quantile regression model with TV penalty
- Nonlinear and nonseparable structural functions in regression discontinuity designs with a continuous treatment
- Deep learning based on randomized quasi-Monte Carlo method for solving linear Kolmogorov partial differential equation
- Sample average approximation with heavier tails II: localization in stochastic convex optimization and persistence results for the Lasso
- Communication-efficient estimation of high-dimensional quantile regression
- Branch-and-bound solves random binary IPs in poly\((n)\)-time
- Concentration behavior of the penalized least squares estimator
- Locally simultaneous inference
- Unbiasing and robustifying implied volatility calibration in a cryptocurrency market with large bid-ask spreads and missing quotes
- Minimax rates for conditional density estimation via empirical entropy
- Noisy linear inverse problems under convex constraints: exact risk asymptotics in high dimensions
- Learning finite-dimensional coding schemes with nonlinear reconstruction maps
- Non parametric learning approach to estimate conditional quantiles in the dependent functional data case
- Factorisable multitask quantile regression
- Oracle inequalities for convex loss functions with nonlinear targets
- Statistical performance of quantile tensor regression with convex regularization
- Convergence rate of optimal quantization and application to the clustering performance of the empirical measure
- An introduction to compressed sensing
- Exponential concentration for geometric-median-of-means in non-positive curvature spaces
- Asymptotically faster estimation of high-dimensional additive models using subspace learning
- Solving PDEs on spheres with physics-informed convolutional neural networks
- Best subset selection for high-dimensional non-smooth models using iterative hard thresholding