The following pages link to Sub-Gaussian mean estimators (Q510694):
Displayed 44 items.
- Simpler PAC-Bayesian bounds for hostile data (Q1640576)
- Robust regression using biased objectives (Q1698865)
- Sub-Gaussian estimators of the mean of a random vector (Q1731055)
- Sub-Gaussian estimators of the mean of a random matrix with heavy-tailed entries (Q1991680)
- Solvable integration problems and optimal sample size selection (Q2001207)
- Learning from MOM's principles: Le Cam's approach (Q2010482)
- K-bMOM: A robust Lloyd-type clustering algorithm based on bootstrap median-of-means (Q2072412)
- Finite sample properties of parametric MMD estimation: robustness to misspecification and dependence (Q2073208)
- Robust and efficient mean estimation: an approach based on the properties of self-normalized sums (Q2074319)
- Robust parameter estimation of regression models under weakened moment assumptions (Q2081782)
- Distribution-free robust linear regression (Q2113267)
- Robust sub-Gaussian estimation of a mean vector in nearly linear time (Q2119240)
- All-in-one robust estimator of the Gaussian mean (Q2131271)
- Concentration study of M-estimators using the influence function (Q2154967)
- Optimal robust mean and location estimation via convex programs with respect to any pseudo-norms (Q2159256)
- Robust statistical learning with Lipschitz and convex loss functions (Q2174664)
- Confidence regions and minimax rates in outlier-robust estimation on the probability simplex (Q2192314)
- Robust machine learning by median-of-means: theory and practice (Q2196199)
- Mean estimation with sub-Gaussian rates in polynomial time (Q2196216)
- Robust inference via multiplier bootstrap (Q2196240)
- Robust classification via MOM minimization (Q2203337)
- Robust modifications of U-statistics and applications to covariance estimation problems (Q2278677)
- Distributed statistical estimation and rates of convergence in normal approximation (Q2283576)
- Algorithms of robust stochastic optimization based on mirror descent method (Q2289049)
- User-friendly covariance estimation for heavy-tailed distributions (Q2292396)
- Convergence rates of least squares regression estimators with heavy-tailed errors (Q2313287)
- Efficient learning with robust gradient descent (Q2320583)
- The breakdown point of the median of means tournament (Q2322677)
- Multidimensional linear functional estimation in sparse Gaussian models and robust estimation of the mean (Q2323942)
- Mean estimation and regression under heavy-tailed distributions: A survey (Q2329044)
- Near-optimal mean estimators with respect to general norms (Q2334371)
- Robust multivariate mean estimation: the optimality of trimmed mean (Q2656601)
- Adaptive Huber Regression (Q3304852)
- A New Principle for Tuning-Free Huber Regression (Q5037807)
- Comment on “A Tuning-Free Robust and Efficient Approach to High-Dimensional Regression” (Q5146023)
- (Q5149262)
- Deconvolution for some singular density errors via a combinatorial median of means approach (Q6062700)
- Adaptive robust large volatility matrix estimation based on high-frequency financial data (Q6090556)
- Rate-optimal robust estimation of high-dimensional vector autoregressive models (Q6117053)
- Topics in robust statistical learning (Q6124899)
- Nonlinear Consensus+Innovations under Correlated Heavy-Tailed Noises: Mean Square Convergence Rate and Asymptotics (Q6148453)
- Catoni-style confidence sequences for heavy-tailed mean estimation (Q6171648)
- Robust supervised learning with coordinate gradient descent (Q6172182)
- Mean estimation in high dimension (Q6200221)