Learning without Concentration
Publication:2796408
DOI: 10.1145/2699439 · zbMath: 1333.68232 · arXiv: 1401.0304 · OpenAlex: W2103775046 · MaRDI QID: Q2796408
Publication date: 24 March 2016
Published in: Journal of the ACM
Full work available at URL: https://arxiv.org/abs/1401.0304
Related Items (68)
- On least squares estimation under heteroscedastic and heavy-tailed errors
- Generalization bounds for non-stationary mixing processes
- On aggregation for heavy-tailed classes
- Performance of empirical risk minimization in linear aggregation
- Aggregated hold out for sparse linear regression with a robust loss function
- Simpler PAC-Bayesian bounds for hostile data
- Learning without concentration for general loss functions
- On the geometry of polytopes generated by heavy-tailed random vectors
- Upper bounds on product and multiplier empirical processes
- Low rank matrix recovery from rank one measurements
- Unnamed Item
- Robust statistical learning with Lipschitz and convex loss functions
- Posterior concentration and fast convergence rates for generalized Bayesian learning
- Convergence rates for empirical barycenters in metric spaces: curvature, convexity and extendable geodesics
- Generic error bounds for the generalized Lasso with sub-exponential data
- Sample average approximation with heavier tails. I: Non-asymptotic bounds with weak assumptions and stochastic constraints
- Regularization, sparse recovery, and median-of-means tournaments
- Empirical risk minimization for heavy-tailed losses
- Finite sample behavior of a sieve profile estimator in the single index model
- A unified approach to uniform signal recovery from nonlinear observations
- Orthogonal statistical learning
- Robust machine learning by median-of-means: theory and practice
- Mean estimation in high dimension
- On the Geometry of Random Polytopes
- Robust classification via MOM minimization
- Stable low-rank matrix recovery via null space properties
- Approximating the covariance ellipsoid
- Relative deviation learning bounds and generalization with unbounded loss functions
- Optimal rates of statistical seriation
- Extending the scope of the small-ball method
- Complex phase retrieval from subgaussian measurements
- Quantized Compressed Sensing: A Survey
- Low-rank matrix recovery via rank one tight frame measurements
- Stable recovery and the coordinate small-ball behaviour of random vectors
- Unnamed Item
- Thin-shell concentration for random vectors in Orlicz balls via moderate deviations and Gibbs measures
- Unnamed Item
- Column normalization of a random measurement matrix
- Slope meets Lasso: improved oracle bounds and optimality
- Regularization and the small-ball method. I: Sparse recovery
- Sparse recovery under weak moment assumptions
- Estimation from nonlinear observations via convex programming with application to bilinear regression
- Learning from MOM's principles: Le Cam's approach
- Unnamed Item
- Variance-based regularization with convex objectives
- Phase retrieval with PhaseLift algorithm
- Approximating \(L_p\) unit balls via random sampling
- Non-Gaussian hyperplane tessellations and robust one-bit compressed sensing
- The geometric median and applications to robust mean estimation
- A MOM-based ensemble method for robustness, subsampling and hyperparameter tuning
- Empirical risk minimization for time series: nonparametric performance bounds for prediction
- Learning with correntropy-induced losses for regression with mixture of symmetric stable noise
- Convergence rates of least squares regression estimators with heavy-tailed errors
- Endpoint Results for Fourier Integral Operators on Noncompact Symmetric Spaces
- Low-rank matrix recovery with composite optimization: good conditioning and rapid convergence
- Stochastic (Approximate) Proximal Point Methods: Convergence, Optimality, and Adaptivity
- Solving equations of random convex functions via anchored regression
- Regularization and the small-ball method II: complexity dependent error rates
- Localized Gaussian width of \(M\)-convex hulls with applications to Lasso and convex aggregation
- Mean estimation and regression under heavy-tailed distributions: A survey
- On Monte-Carlo methods in convex stochastic optimization
- Exact minimax risk for linear least squares, and the lower tail of sample covariance matrices
- Low-Rank Matrix Estimation from Rank-One Projections by Unlifted Convex Optimization
- AdaBoost and robust one-bit compressed sensing
- Proof methods for robust low-rank matrix recovery
- Suboptimality of constrained least squares and improvements via non-linear predictors
- Distribution-free robust linear regression
- Fast Convex Pruning of Deep Neural Networks
Cites Work
- Unnamed Item
- Unnamed Item
- On generic chaining and the smallest singular value of random matrices with heavy tails
- Sparse recovery under weak moment assumptions
- Oracle inequalities in empirical risk minimization and sparse recovery problems. École d'Été de Probabilités de Saint-Flour XXXVIII-2008.
- Empirical processes with a bounded \(\psi_1\) diameter
- Some limit theorems for empirical processes (with discussion)
- Best subset selection, persistence in high-dimensional statistical learning and optimization under \(l_1\) constraint
- Sharper bounds for Gaussian and empirical processes
- Persistence in high-dimensional linear predictor selection and the virtue of overparametrization
- Weak convergence and empirical processes. With applications to statistics
- \(\ell _{1}\)-regularized linear regression: persistence and oracle inequalities
- On the singular values of random matrices
- Reconstruction and subgaussian operators in asymptotic geometric analysis
- Minimax rate of convergence and the performance of empirical risk minimization in phase recovery
- Local Rademacher complexities
- Concentration Inequalities
- Bounding the Smallest Singular Value of a Random Matrix Without Concentration
- A Remark on the Diameter of Random Sections of Convex Bodies
- Upper and Lower Bounds for Stochastic Processes