Square-root lasso: pivotal recovery of sparse signals via conic programming
DOI: 10.1093/BIOMET/ASR043
zbMATH Open: 1228.62083
arXiv: 1009.5689
OpenAlex: W3121832289
MaRDI QID: Q3107973
FDO: Q3107973
Authors: A. Belloni, Victor Chernozhukov, Lie Wang
Publication date: 28 December 2011
Published in: Biometrika
Full work available at URL: https://arxiv.org/abs/1009.5689
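The estimator studied in this paper is the square-root lasso, \(\hat\beta \in \arg\min_\beta \|y - X\beta\|_2/\sqrt{n} + (\lambda/n)\|\beta\|_1\). Because the fit term is the root mean squared residual rather than its square, the penalty level can be chosen pivotally, i.e. without knowing the noise level \(\sigma\), via \(\lambda = c\sqrt{n}\,\Phi^{-1}(1 - \alpha/(2p))\), and the problem remains a second-order cone program, hence "via conic programming" in the title. Below is a minimal sketch, assuming cvxpy as the conic-solver front end (the paper itself only requires a generic conic formulation); the simulated data and the constants \(c = 1.1\), \(\alpha = 0.05\) are illustrative, not prescribed here.

```python
# Minimal square-root lasso sketch; cvxpy and the data below are assumptions.
import numpy as np
import cvxpy as cp
from scipy.stats import norm

rng = np.random.default_rng(0)
n, p, s = 100, 200, 5                       # samples, features, true sparsity
X = rng.standard_normal((n, p))
beta_true = np.r_[np.ones(s), np.zeros(p - s)]
y = X @ beta_true + rng.standard_normal(n)  # sigma = 1, unknown to the estimator

# Pivotal penalty: lambda = c * sqrt(n) * Phi^{-1}(1 - alpha/(2p)),
# which does not involve the noise level sigma.
c, alpha = 1.1, 0.05
lam = c * np.sqrt(n) * norm.ppf(1 - alpha / (2 * p))

beta = cp.Variable(p)
# ||y - X beta||_2 / sqrt(n) + (lam/n) * ||beta||_1 is SOCP-representable,
# so cvxpy hands it to a second-order cone solver.
objective = cp.Minimize(cp.norm(y - X @ beta, 2) / np.sqrt(n)
                        + (lam / n) * cp.norm(beta, 1))
cp.Problem(objective).solve()
print(np.round(beta.value[:10], 2))  # leading coordinates of the estimate
```

Note that, unlike the plain lasso, no preliminary estimate of \(\sigma\) enters the penalty, which is the sense in which the recovery is pivotal.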
Recommendations
- Square-root Lasso for high-dimensional sparse linear systems with weakly dependent errors
- Pivotal estimation via square-root lasso in nonparametric regression
- Improved bounds for square-root Lasso and square-root slope
- Pivotal Estimation in High-Dimensional Regression via Linear Programming
- Sharp oracle inequalities for square root regularization
Mathematics Subject Classification:
- Asymptotic properties of parametric estimators (62F12)
- Linear regression; mixed models (62J05)
- Applications of mathematical programming (90C90)
Cited In (first 100 items shown)
- Adaptive estimation of high-dimensional signal-to-noise ratios
- A study on tuning parameter selection for the high-dimensional lasso
- High-dimensional inference: confidence intervals, \(p\)-values and R-software \texttt{hdi}
- Sparse identification of posynomial models
- The partial linear model in high dimensions
- Simplex QP-based methods for minimizing a conic quadratic objective over polyhedra
- Sparse high-dimensional regression: exact scalable algorithms and phase transitions
- A self-calibrated direct approach to precision matrix estimation and linear discriminant analysis in high dimensions
- Robust subspace clustering
- Accuracy assessment for high-dimensional linear regression
- Sign-constrained least squares estimation for high-dimensional regression
- Selective inference with a randomized response
- A general family of trimmed estimators for robust high-dimensional data analysis
- Non-negative least squares for high-dimensional linear models: consistency and sparse recovery without regularization
- A knockoff filter for high-dimensional selective inference
- Tuning-free heterogeneous inference in massive networks
- Uniformly valid post-regularization confidence regions for many functional parameters in z-estimation framework
- Linear hypothesis testing in dense high-dimensional linear models
- Greedy variance estimation for the LASSO
- Oracle inequalities for high-dimensional prediction
- Regularization for high-dimensional covariance matrix
- Optimal sparsity testing in linear regression model
- Penalised robust estimators for sparse and high-dimensional linear models
- Noisy low-rank matrix completion with general sampling distribution
- A tuning-free robust and efficient approach to high-dimensional regression
- An inexact augmented Lagrangian method for second-order cone programming with applications
- Proximal alternating penalty algorithms for nonsmooth constrained convex optimization
- Stein's method for nonlinear statistics: a brief survey and recent progress
- On estimation of the diagonal elements of a sparse precision matrix
- Joint estimation and variable selection for mean and dispersion in proper dispersion models
- Prediction error bounds for linear regression with the TREX
- Perspective functions: proximal calculus and applications in high-dimensional statistics
- Econometric estimation with high-dimensional moment equalities
- Generalization of constraints for high dimensional regression problems
- Self-normalization: taming a wild population in a heavy-tailed world
- Simultaneous feature selection and clustering based on square root optimization
- Sharp MSE bounds for proximal denoising
- High-dimensional inference robust to outliers with ℓ1-norm penalization
- A projection-based conditional dependence measure with applications to high-dimensional undirected graphical models
- Non-Convex Global Minimization and False Discovery Rate Control for the TREX
- Confidence intervals for high-dimensional inverse covariance estimation
- Group inference in high dimensions with applications to hierarchical testing
- Asymptotic normality and optimalities in estimation of large Gaussian graphical models
- Gaussian graphical model estimation with false discovery rate control
- Honest confidence regions and optimality in high-dimensional precision matrix estimation
- Goodness-of-Fit Tests for High Dimensional Linear Models
- On asymptotically optimal confidence regions and tests for high-dimensional models
- Optimal Estimation of Genetic Relatedness in High-Dimensional Linear Models
- The benefit of group sparsity in group inference with de-biased scaled group Lasso
- Testing endogeneity with high dimensional covariates
- Global-local mixtures: a unifying framework
- High-dimensional inference in misspecified linear models
- Confidence Intervals for Low Dimensional Parameters in High Dimensional Linear Models
- Inference robust to outliers with \(\ell_1\)-norm penalization
- Oracle inequalities for high dimensional vector autoregressions
- Group penalized quantile regression
- Double Machine Learning for Partially Linear Mixed-Effects Models with Repeated Measurements
- A lava attack on the recovery of sums of dense and sparse signals
- Pivotal estimation via square-root lasso in nonparametric regression
- Worst possible sub-directions in high-dimensional models
- Robust Wasserstein profile inference and applications to machine learning
- Solution paths of variational regularization methods for inverse problems
- The \(L_1\) penalized LAD estimator for high dimensional linear regression
- Significance testing in non-sparse high-dimensional linear models
- An efficient semismooth Newton method for adaptive sparse signal recovery problems
- High-dimensional regression with potential prior information on variable importance
- A general theory of concave regularization for high-dimensional sparse estimation problems
- High-dimensional regression with unknown variance
- Ridge regression and asymptotic minimax estimation over spheres of growing dimension
- Adaptive smoothing algorithms for nonsmooth composite convex minimization
- A fast trans-lasso algorithm with penalized weighted score function
- Multi-block alternating direction method of multipliers for ultrahigh dimensional quantile fused regression
- An off-the-grid approach to multi-compartment magnetic resonance fingerprinting
- A two-stage regularization method for variable selection and forecasting in high-order interaction model
- A phase transition for finding needles in nonlinear haystacks with LASSO artificial neural networks
- A fast and effective algorithm for sparse linear regression with \(\ell_p\)-norm data fidelity and elastic net regularization
- Recovery of sums of sparse and dense signals by incorporating graphical structure among predictors
- Group sparse recovery via group square-root elastic net and the iterative multivariate thresholding-based algorithm
- Finite-sample analysis of \(M\)-estimators using self-concordance
- Sharp oracle inequalities for square root regularization
- Noise covariance estimation in multi-task high-dimensional linear models
- Sparse Convoluted Rank Regression in High Dimensions
- Regularized estimation in sparse high-dimensional multivariate regression, with application to a DNA methylation study
- Adapting to unknown noise level in sparse deconvolution
- Title not available
- Online Debiasing for Adaptively Collected High-Dimensional Data With Applications to Time Series Analysis
- Sparse HP filter: finding kinks in the COVID-19 contact rate
- A smooth primal-dual optimization framework for nonsmooth composite convex minimization
- Group sparse structural smoothing recovery: model, statistical properties and algorithm
- Inference for high-dimensional linear models with locally stationary error processes
- Robust oracle estimation and uncertainty quantification for possibly sparse quantiles
- Optimal learning
- Optimal designs in sparse linear models
- Smooth over-parameterized solvers for non-smooth structured optimization
- Square root LASSO: well-posedness, Lipschitz stability, and the tuning trade-off
- Robust and tuning-free sparse linear regression via square-root slope
- Improved bounds for square-root Lasso and square-root slope
- Penalized and constrained LAD estimation in fixed and high dimension
- Stable local-smooth principal component pursuit
- A unified convergence rate analysis of the accelerated smoothed gap reduction algorithm