Adaptive Dantzig density estimation
DOI: 10.1214/09-AIHP351 · zbMATH Open: 1207.62077 · arXiv: 0905.0884 · MaRDI QID: Q629798
Authors: Karine Bertin, E. Le Pennec, V. Rivoirard
Publication date: 10 March 2011
Published in: Annales de l'Institut Henri Poincaré. Probabilités et Statistiques
Full work available at URL: https://arxiv.org/abs/0905.0884
Keywords: density estimation; sparsity; calibration; oracle inequalities; concentration inequalities; dictionary; Dantzig estimate; lasso estimate
MSC: Nonparametric estimation (62G05); Density estimation (62G07); Asymptotic properties of nonparametric inference (62G20)
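For orientation on the "Dantzig estimate" keyword: the Dantzig selector (Candès and Tao, cited below) minimizes the ℓ1-norm of the coefficients subject to a sup-norm bound on the correlations between the dictionary and the residual, which is a linear program. The sketch below is an illustrative regression-setting implementation, not the density-estimation procedure of this paper; the function name and tuning parameter `lam` are placeholders.

```python
import numpy as np
from scipy.optimize import linprog

def dantzig_selector(X, y, lam):
    """Illustrative Dantzig selector:
    minimize ||beta||_1  subject to  ||X.T @ (y - X @ beta)||_inf <= lam,
    recast as a linear program via beta = u - v with u, v >= 0."""
    n, p = X.shape
    A = X.T @ X
    b = X.T @ y
    # Objective: sum(u) + sum(v) = ||beta||_1
    c = np.ones(2 * p)
    # Constraint |b - A(u - v)| <= lam, split into two one-sided inequalities
    A_ub = np.block([[A, -A], [-A, A]])
    b_ub = np.concatenate([b + lam, lam - b])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=(0, None), method="highs")
    u, v = res.x[:p], res.x[p:]
    return u - v
```

In the paper's setting the response is replaced by empirical dictionary coefficients of the unknown density, and `lam` becomes a data-driven, coordinate-wise calibrated threshold derived from concentration inequalities.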
Cites Work
- The Adaptive Lasso and Its Oracle Properties
- Least angle regression. (With discussion)
- Ideal spatial adaptation by wavelet shrinkage
- Lasso-type recovery of sparse representations for high-dimensional data
- Simultaneous analysis of Lasso and Dantzig selector
- High-dimensional generalized linear models and the lasso
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- High-dimensional graphs and variable selection with the Lasso
- Asymptotics for Lasso-type estimators.
- Sparsity oracle inequalities for the Lasso
- The sparsity and bias of the LASSO selection in high-dimensional linear regression
- Stable recovery of sparse overcomplete representations in the presence of noise
- A new approach to variable selection in least squares problems
- Aggregation for Gaussian regression
- Minimal penalties for Gaussian model selection
- Concentration inequalities and model selection. Ecole d'Eté de Probabilités de Saint-Flour XXXIII -- 2003.
- Aggregation and Sparsity Via ℓ1 Penalized Least Squares
- Sparse Density Estimation with ℓ1 Penalties
- On minimax density estimation on \(\mathbb R\)
- Adaptive density estimation: A curse of support?
- Sup-norm convergence rate and sign concentration property of Lasso and Dantzig estimators
- Atomic decomposition by basis pursuit
- Near-ideal model selection by \(\ell _{1}\) minimization
- SPADES and mixture models
- Near optimal thresholding estimation of a Poisson intensity on the real line
Cited In (21)
- Adaptive estimation of the baseline hazard function in the Cox model by model selection, with high-dimensional covariates
- Penalized logspline density estimation using total variation penalty
- Estimator selection: a new method with applications to kernel density estimation
- A class of adaptive distribution-free procedures
- Optimal Kullback-Leibler aggregation in mixture density estimation by maximum likelihood
- Optimal adaptive estimation of the relative density
- Some rates of convergence for the selected Lasso estimator
- A simple adaptive estimator of the integrated square of a density
- Oracle inequalities for the Lasso in the high-dimensional Aalen multiplicative intensity model
- Adaptive estimation of the mode of a multivariate density
- Compressive Gaussian mixture estimation
- Sparse recovery from extreme eigenvalues deviation inequalities
- Compressive statistical learning with random feature moments
- Lasso and probabilistic inequalities for multivariate point processes
- Lasso-type estimators for semiparametric nonlinear mixed-effects models estimation
- Lasso in infinite dimension: application to variable selection in functional multivariate linear regression
- High-dimensional additive hazards models and the lasso
- Adaptive estimation in the nonparametric random coefficients binary choice model by needlet thresholding
- A non-asymptotic approach for model selection via penalization in high-dimensional mixture of experts models
- Adaptive density estimation based on a mixture of gammas
- The Dantzig selector and sparsity oracle inequalities