Understanding Implicit Regularization in Over-Parameterized Single Index Model
Publication: 6185498
DOI: 10.1080/01621459.2022.2044824
arXiv: 2007.08322
MaRDI QID: Q6185498
Zhuoran Yang, Jianqing Fan, Unnamed Author
Publication date: 8 January 2024
Published in: Journal of the American Statistical Association
Full work available at URL: https://arxiv.org/abs/2007.08322
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- Estimation of high-dimensional low-rank matrices
- A shrinkage principle for heavy-tailed data: high-dimensional robust low-rank matrix recovery
- Empirical risk minimization for heavy-tailed losses
- The restricted isometry property and its implications for compressed sensing
- Lasso, fractional norm and structured sparse estimation using a Hadamard product parametrization
- Robust covariance estimation for approximate factor models
- Challenging the empirical mean and empirical variance: a deviation study
- Sub-Gaussian estimators of the mean of a random matrix with heavy-tailed entries
- A selective overview of deep learning
- Robust high-dimensional factor models with applications to statistical machine learning
- Robust modifications of U-statistics and applications to covariance estimation problems
- User-friendly covariance estimation for heavy-tailed distributions
- On model selection consistency of regularized M-estimators
- Large covariance estimation through elliptical factor models
- Strong oracle optimality of folded concave penalized estimation
- L1-Regularized Least Squares for Support Recovery of High Dimensional Single Index Models with Gaussian Designs
- The Generalized Lasso With Non-Linear Observations
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- High-dimensional estimation with geometric constraints
- Structured Signal Recovery From Non-Linear and Heavy-Tailed Measurements
- Graph-Dependent Implicit Regularisation for Distributed Stochastic Subgradient Descent
- Non-Gaussian observations in nonlinear compressed sensing via Stein discrepancies
- Minimax Rates of Estimation for High-Dimensional Linear Regression Over $\ell_q$-Balls
- Phase Retrieval via Matrix Completion
- Introduction to nonparametric estimation
- Compressed sensing