A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
DOI: 10.1214/12-STS400 · zbMath: 1331.62350 · arXiv: 1010.2731 · MaRDI QID: Q5965308
Sahand N. Negahban, Pradeep Ravikumar, Martin J. Wainwright, Bin Yu
Publication date: 3 March 2016
Published in: Statistical Science
Full work available at URL: https://arxiv.org/abs/1010.2731
Keywords: \(M\)-estimators; sparsity; group Lasso; \(\ell_{1}\)-regularization; high-dimensional statistics; Lasso; nuclear norm
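The keywords reflect the paper's template: estimators of the form \(\hat{\theta} \in \arg\min_{\theta} \{\mathcal{L}(\theta) + \lambda_n \mathcal{R}(\theta)\}\), where the regularizer \(\mathcal{R}\) (e.g. the \(\ell_1\), group-Lasso, or nuclear norm) is decomposable with respect to a model subspace. As a minimal illustrative sketch, not part of this record, the snippet below solves the simplest instance, the Lasso \(\min_{\theta} \frac{1}{2n}\|y - X\theta\|_2^2 + \lambda_n \|\theta\|_1\), by proximal gradient descent (ISTA); the simulation setup and the constant in \(\lambda_n\) are assumptions made for the example.

```python
import numpy as np

def soft_threshold(z, t):
    """Proximal operator of t * ||.||_1 (elementwise soft-thresholding)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_ista(X, y, lam, n_iter=500):
    """Proximal gradient (ISTA) for the Lasso, an M-estimator with the
    decomposable l1 regularizer:
        min_theta (1/2n) ||y - X theta||_2^2 + lam * ||theta||_1
    """
    n, p = X.shape
    theta = np.zeros(p)
    # Step size 1/L, with L the Lipschitz constant of the loss gradient
    # (largest squared singular value of X, scaled by 1/n).
    L = np.linalg.norm(X, ord=2) ** 2 / n
    for _ in range(n_iter):
        grad = X.T @ (X @ theta - y) / n   # gradient of the quadratic loss
        theta = soft_threshold(theta - grad / L, lam / L)
    return theta

# Hypothetical simulation: sparse ground truth, noisy high-dimensional design.
rng = np.random.default_rng(0)
n, p, s = 100, 400, 5
X = rng.standard_normal((n, p))
theta_star = np.zeros(p)
theta_star[:s] = 1.0
y = X @ theta_star + 0.1 * rng.standard_normal(n)
lam = 0.1 * np.sqrt(np.log(p) / n)  # order suggested by the theory; constant assumed
theta_hat = lasso_ista(X, y, lam)
```

The choice \(\lambda_n \asymp \sqrt{\log p / n}\) mirrors the order the paper's corollaries prescribe for \(\ell_1\)-regularized least squares; in practice the constant would be tuned, e.g. by cross-validation.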
Related Items (only showing first 100 items)
Bayesian vector heterogeneous autoregressive modelling ⋮ Robust transfer learning of high-dimensional generalized linear model ⋮ Multiple change points detection in high-dimensional multivariate regression ⋮ Generalized linear models with structured sparsity estimators ⋮
Rejoinder to “Reader reaction to ‘Outcome‐adaptive Lasso: Variable selection for causal inference’ by Shortreed and Ertefaie (2017)” ⋮ Multivariate functional response low‐rank regression with an application to brain imaging data ⋮ Oracle inequality for sparse trace regression models with exponential \(\beta\)-mixing errors ⋮
Inference on heterogeneous treatment effects in high‐dimensional dynamic panels under weak dependence ⋮ High-Dimensional Gaussian Graphical Regression Models with Covariates ⋮ Concave Likelihood-Based Regression with Finite-Support Response Variables ⋮ Penalized wavelet nonparametric univariate logistic regression for irregular spaced data ⋮
The rate of convergence for sparse and low-rank quantile trace regression ⋮ Sparse and Low-Rank Matrix Quantile Estimation With Application to Quadratic Regression ⋮ Integrative sparse reduced-rank regression via orthogonal rotation for analysis of high-dimensional multi-source data ⋮ Sparse Laplacian shrinkage for nonparametric transformation survival model ⋮
Model-Assisted Uniformly Honest Inference for Optimal Treatment Regimes in High Dimension ⋮ Correlation Tensor Decomposition and Its Application in Spatial Imaging Data ⋮ Honest Confidence Sets for High-Dimensional Regression by Projection and Shrinkage ⋮ Profile GMM estimation of panel data models with interactive fixed effects ⋮
A Likelihood-Based Approach for Multivariate Categorical Response Regression in High Dimensions ⋮ Individual Data Protected Integrative Regression Analysis of High-Dimensional Heterogeneous Data ⋮ Rate-optimal robust estimation of high-dimensional vector autoregressive models ⋮ Robust matrix estimations meet Frank-Wolfe algorithm ⋮
Distributed Decoding From Heterogeneous 1-Bit Compressive Measurements ⋮ A convex-Nonconvex strategy for grouped variable selection ⋮ Post-selection Inference of High-dimensional Logistic Regression Under Case–Control Design ⋮ The nonparametric Box-Cox model for high-dimensional regression analysis ⋮ Multi-Task Learning with High-Dimensional Noisy Images ⋮
Robust High-Dimensional Regression with Coefficient Thresholding and Its Application to Imaging Data Analysis ⋮ A network Lasso model for regression ⋮ Cross-Fitted Residual Regression for High-Dimensional Heteroscedasticity Pursuit ⋮ Two-stage communication-efficient distributed sparse M-estimation with missing data ⋮
Variable selection and regularization via arbitrary rectangle-range generalized elastic net ⋮ A unified precision matrix estimation framework via sparse column-wise inverse operator under weak sparsity ⋮ The Lasso with general Gaussian designs with applications to hypothesis testing ⋮ Carving model-free inference ⋮ Envelopes and principal component regression ⋮
Multiple Change Point Detection in Reduced Rank High Dimensional Vector Autoregressive Models ⋮ Approximate Selective Inference via Maximum Likelihood ⋮ Statistical performance of quantile tensor regression with convex regularization ⋮ Expectile trace regression via low-rank and group sparsity regularization ⋮
Distributed estimation and inference for spatial autoregression model with large scale networks ⋮ High-dimensional functional graphical model structure learning via neighborhood selection approach ⋮ A joint estimation for the high-dimensional regression modeling on stratified data ⋮ Machine Learning Time Series Regressions With an Application to Nowcasting ⋮
Analysis of global and local optima of regularized quantile regression in high dimensions: a subgradient approach ⋮ More communication-efficient distributed sparse learning ⋮ Direct covariance matrix estimation with compositional data ⋮ On the Use of Minimum Penalties in Statistical Learning ⋮ A fast trans-lasso algorithm with penalized weighted score function ⋮
D4R: doubly robust reduced rank regression in high dimension ⋮ Trustworthy regularized huber regression for outlier detection ⋮ Estimation and Inference for High-Dimensional Generalized Linear Models with Knowledge Transfer ⋮ Estimation of Linear Functionals in High-Dimensional Linear Models: From Sparsity to Nonsparsity ⋮ A generalized formulation for group selection via ADMM ⋮
Efficient variable selection for high-dimensional multiplicative models: a novel LPRE-based approach ⋮ Multiresolution categorical regression for interpretable cell-type annotation ⋮ A naïve Bayes regularized logistic regression estimator for low-dimensional classification ⋮ High-dimensional data segmentation in regression settings permitting temporal dependence and non-Gaussianity ⋮
Adaptive Huber trace regression with low-rank matrix parameter via nonconvex regularization ⋮ Weighted likelihood transfer learning for high-dimensional generalized linear models ⋮ Multivariate log-contrast regression with sub-compositional predictors: testing the association between preterm infants' gut microbiome and neurobehavioral outcomes ⋮
Uniform recovery guarantees for quantized corrupted sensing using structured or generative priors ⋮ Inference in Approximately Sparse Correlated Random Effects Probit Models With Panel Data ⋮ Sparse group regularization for semi-continuous transportation data ⋮ Matrix Linear Discriminant Analysis ⋮ The statistical rate for support matrix machines under low rankness and row (column) sparsity ⋮
Asymptotically faster estimation of high-dimensional additive models using subspace learning ⋮ Structure learning for continuous time Bayesian networks via penalized likelihood ⋮ A Regression-Based Approach to Robust Estimation and Inference for Genetic Covariance ⋮ A Decorrelating and Debiasing Approach to Simultaneous Inference for High-Dimensional Confounded Models ⋮
Online Regularization toward Always-Valid High-Dimensional Dynamic Pricing ⋮ A theory of optimal convex regularization for low-dimensional recovery ⋮ Scenario-based quantile connectedness of the U.S. interbank liquidity risk network ⋮ Nuclear norm regularized quantile regression with interactive fixed effects ⋮
Fully polynomial-time randomized approximation schemes for global optimization of high-dimensional minimax concave penalized generalized linear models ⋮ Statistical inference for high-dimensional linear regression with blockwise missing data ⋮ Worst possible sub-directions in high-dimensional models ⋮ Covariate-adjusted inference for differential analysis of high-dimensional networks ⋮
A shrinkage principle for heavy-tailed data: high-dimensional robust low-rank matrix recovery ⋮ Nonnegative-Lasso and application in index tracking ⋮ Penalized and constrained LAD estimation in fixed and high dimension ⋮ A general family of trimmed estimators for robust high-dimensional data analysis ⋮ An analysis of penalized interaction models ⋮
A self-calibrated direct approach to precision matrix estimation and linear discriminant analysis in high dimensions ⋮ Sparse recovery via nonconvex regularized \(M\)-estimators over \(\ell_q\)-balls ⋮ Minimum distance Lasso for robust high-dimensional regression ⋮ Testability of high-dimensional linear models with nonsparse structures ⋮ Adaptive log-density estimation ⋮
Exact post-selection inference, with application to the Lasso ⋮ Comprehensive comparative analysis and identification of RNA-binding protein domains: multi-class classification and feature selection ⋮ Doubly robust semiparametric inference using regularized calibrated estimation with high-dimensional data ⋮ Regularized high dimension low tubal-rank tensor regression ⋮
Post-model-selection inference in linear regression models: an integrated review ⋮ Joint estimation of precision matrices in heterogeneous populations ⋮ Geometric inference for general high-dimensional linear inverse problems ⋮ Oracle inequalities, variable selection and uniform inference in high-dimensional correlated random effects panel data models ⋮
Sharp MSE bounds for proximal denoising ⋮ Distributed testing and estimation under sparse high dimensional models ⋮ Penalized least square in sparse setting with convex penalty and non Gaussian errors
Uses Software
Cites Work
- Estimation of high-dimensional low-rank matrices
- Estimation of (near) low-rank matrices with noise and high-dimensional scaling
- Sparsity in multiple kernel learning
- Oracle inequalities and optimal inference under group sparsity
- Characteristic vectors of bordered matrices with infinite dimensions
- A note on the Lasso for Gaussian graphical model selection
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- High-dimensional Ising model selection using \(\ell _{1}\)-regularized logistic regression
- The benefit of group sparsity
- Optimal rates of convergence for covariance matrix estimation
- Covariance regularization by thresholding
- Operator norm consistent estimation of large-dimensional sparse covariance matrices
- Lasso-type recovery of sparse representations for high-dimensional data
- Algorithms for simultaneous sparse approximation. I: Greedy pursuit
- High-dimensional additive modeling
- Sparsistency and rates of convergence in large covariance matrix estimation
- The composite absolute penalties family for grouped and hierarchical variable selection
- Persistence in high-dimensional linear predictor selection and the virtue of overparametrization
- Statistical analysis of observations of increasing dimension. Transl. from the Russian
- Sparse permutation invariant covariance estimation
- On the asymptotic properties of the group lasso estimator for linear models
- Honest variable selection in linear and logistic regression models via \(\ell _{1}\) and \(\ell _{1}+\ell _{2}\) penalization
- On the conditions used to prove oracle results for the Lasso
- Self-concordant analysis for logistic regression
- High-dimensional covariance estimation by minimizing \(\ell _{1}\)-penalized log-determinant divergence
- Two proposals for robust PCA using semidefinite programming
- Simultaneous analysis of Lasso and Dantzig selector
- High-dimensional generalized linear models and the lasso
- Sparsity oracle inequalities for the Lasso
- Support union recovery in high-dimensional multivariate regression
- Aggregation for Gaussian regression
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- High-dimensional graphs and variable selection with the Lasso
- Exact matrix completion via convex optimization
- Reconstruction From Anisotropic Random Measurements
- Rank-Sparsity Incoherence for Matrix Decomposition
- Decoding by Linear Programming
- An overview of recent developments in genomics and associated statistical methods
- Interior-Point Method for Nuclear Norm Approximation with Application to System Identification
- Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization
- Atomic Decomposition by Basis Pursuit
- On the Reconstruction of Block-Sparse Signals With an Optimal Number of Measurements
- Sparsity and Smoothness Via the Fused Lasso
- Information-Theoretic Limits on Sparsity Recovery in the High-Dimensional and Noisy Setting
- Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell _{1}$-Constrained Quadratic Programming (Lasso)
- Robust PCA via Outlier Pursuit
- Robust Matrix Decomposition With Sparse Corruptions
- Minimax Rates of Estimation for High-Dimensional Linear Regression Over $\ell_q$-Balls
- Simultaneous Support Recovery in High Dimensions: Benefits and Perils of Block $\ell _{1}/\ell _{\infty} $-Regularization
- Model-Based Compressive Sensing
- Neighborliness of randomly projected simplices in high dimensions
- A Simpler Approach to Matrix Completion
- Minimax-optimal rates for sparse additive models over kernel classes via convex programming
- Restricted strong convexity and weighted matrix completion: Optimal bounds with noise
- Model Selection and Estimation in Regression with Grouped Variables
- Compressed sensing
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers