Estimation and testing under sparsity. École d'Été de Probabilités de Saint-Flour XLV – 2015


DOI: 10.1007/978-3-319-32774-7
zbMath: 1362.62006
OpenAlex: W2461502110
Wikidata: Q98840084
Scholia: Q98840084
MaRDI QID: Q276944

Sara van de Geer

Publication date: 4 May 2016

Published in: Lecture Notes in Mathematics

Full work available at URL: https://doi.org/10.1007/978-3-319-32774-7



Related Items

The robust desparsified lasso and the focused information criterion for high-dimensional generalized linear models
Covariate-adjusted inference for differential analysis of high-dimensional networks
On cross-validated Lasso in high dimensions
Tuning-free ridge estimators for high-dimensional generalized linear models
Hypothesis testing for high-dimensional multinomials: a selective review
Color Image Inpainting via Robust Pure Quaternion Matrix Completion: Error Bound and Weighted Loss
De-biasing the Lasso with degrees-of-freedom adjustment
Penalized least square in sparse setting with convex penalty and non Gaussian errors
Estimating piecewise monotone signals
Hierarchical inference for genome-wide association studies: a view on methodology with software
Rejoinder on: "Hierarchical inference for genome-wide association studies: a view on methodology with software"
Generalized linear models with structured sparsity estimators
Statistical guarantees for regularized neural networks
The Lasso with structured design and entropy of (absolute) convex hulls
Correcting for unknown errors in sparse high-dimensional function approximation
Double-estimation-friendly inference for high-dimensional misspecified models
Optimal learning
Regularized regression when covariates are linked on a network: the 3CoSE algorithm
Statistical analysis of sparse approximate factor models
An alternative to synthetic control for models with many covariates under sparsity
Adapting to unknown noise level in sparse deconvolution
ERM and RERM are optimal estimators for regression problems when malicious outliers corrupt the labels
On the total variation regularized estimator over a class of tree graphs
Ill-posed estimation in high-dimensional models with instrumental variables
Large numbers of explanatory variables: a probabilistic assessment
On tight bounds for the Lasso
Lasso Inference for High-Dimensional Time Series
Regression in Tensor Product Spaces by the Method of Sieves
Sparse space-time models: concentration inequalities and Lasso
Semiparametric efficiency bounds for high-dimensional models
On the exponentially weighted aggregate with the Laplace prior
Degrees of freedom in submodular regularization: a computational perspective of Stein's unbiased risk estimate
The de-biased group Lasso estimation for varying coefficient models
Logistic regression with total variation regularization
Estimation bounds and sharp oracle inequalities of regularized procedures with Lipschitz loss functions
Sharp Oracle Inequalities for Square Root Regularization
Asymptotic linear expansion of regularized M-estimators
On the asymptotic variance of the debiased Lasso
Tensor denoising with trend filtering