The Dantzig selector and sparsity oracle inequalities

From MaRDI portal
Publication: 605023

DOI: 10.3150/09-BEJ187
zbMath: 1452.62486
arXiv: 0909.0861
OpenAlex: W2005838333
Wikidata: Q105584437 (Scholia: Q105584437)
MaRDI QID: Q605023

Vladimir I. Koltchinskii

Publication date: 12 November 2010

Published in: Bernoulli

Full work available at URL: https://arxiv.org/abs/0909.0861




Related Items

Regularity properties for sparse regression

Censored linear model in high dimensions. Penalised linear regression on high-dimensional data with left-censored response variable

Some sharp performance bounds for least squares regression with \(L_1\) regularization

The benefit of group sparsity in group inference with de-biased scaled group Lasso

Simultaneous analysis of Lasso and Dantzig selector

Recovery error analysis of noisy measurement in compressed sensing

Oracle inequalities for the lasso in the Cox model

Estimation and variable selection in partial linear single index models with error-prone linear covariates

Statistical significance in high-dimensional linear models

Sparse recovery under matrix uncertainty

Relaxed sparse eigenvalue conditions for sparse estimation via non-convex regularized regression

High-dimensional covariance matrix estimation with missing observations

Sample average approximation with heavier tails II: localization in stochastic convex optimization and persistence results for the Lasso

\(\ell_{1}\)-penalization for mixture regression models

Sparsity in multiple kernel learning

Analysis of sparse recovery for Legendre expansions using envelope bound

Lasso in infinite dimension: application to variable selection in functional multivariate linear regression

Sup-norm convergence rate and sign concentration property of Lasso and Dantzig estimators

Expectile trace regression via low-rank and group sparsity regularization

Goodness-of-fit tests for high dimensional linear models

High-dimensional additive hazards models and the lasso

The Lasso problem and uniqueness

On the conditions used to prove oracle results for the Lasso

PAC-Bayesian bounds for sparse regression estimation with exponential weights

The adaptive and the thresholded Lasso for potentially misspecified models (and a lower bound for the Lasso)

Mirror averaging with sparsity priors

Transductive versions of the Lasso and the Dantzig selector

General nonexact oracle inequalities for classes with a subexponential envelope

Regularization for Cox's proportional hazards model with NP-dimensionality

Generalization of constraints for high dimensional regression problems

Gaussian approximations and multiplier bootstrap for maxima of sums of high-dimensional random vectors

Nuclear-norm penalization and optimal rates for noisy low-rank matrix completion

Factor models and variable selection in high-dimensional regression analysis

A new perspective on least squares under convex constraint

\(L_1\)-penalization in functional linear regression with subgaussian design

Normalized and standard Dantzig estimators: two approaches

Oracle inequalities for the Lasso in the additive hazards model with interval-censored data

High-dimensional variable screening and bias in subsequent inference, with an empirical comparison

Exponential screening and optimal rates of sparse estimation

Confidence intervals for low dimensional parameters in high dimensional linear models

Quasi-likelihood and/or robust estimation in high dimensions

A selective review of group selection in high-dimensional models

A general theory of concave regularization for high-dimensional sparse estimation problems

Asymptotic normality and optimalities in estimation of large Gaussian graphical models

Comments on: \(\ell_{1}\)-penalization for mixture regression models

Sparse recovery in convex hulls via entropy penalization

Simulation-based Value-at-Risk for nonlinear portfolios

The partial linear model in high dimensions

Weaker regularity conditions and sparse recovery in high-dimensional regression



Cites Work