scientific article; zbMATH DE number 7164718
zbMath 1434.68389 · arXiv 1703.08619 · MaRDI QID Q5214207
Simon Bussy, Agathe Guilloux, Mokhtar Z. Alaya, Stéphane Gaïffas
Publication date: 7 February 2020
Full work available at URL: https://arxiv.org/abs/1703.08619
Title: Binarsity: a penalization for one-hot encoded features in linear supervised learning
Keywords: supervised learning; oracle inequalities; proximal methods; total-variation; features binarization; sparse additive modeling
MSC classification: Asymptotic properties of parametric estimators (62F12); Learning and adaptive systems in artificial intelligence (68T05)
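As a rough illustration of two of the keywords above, the following Python sketch shows what "features binarization" (one-hot encoding of a continuous feature into quantile bins) and the proximal operator of a "total-variation" penalty on the per-bin weights can look like. This is not the authors' implementation; all function names and the toy data are hypothetical, and the TV prox is computed by a simple iterative dual method rather than the exact algorithms used in practice.

```python
# Minimal, self-contained sketch (hypothetical names and toy data):
# binarize a continuous feature, then apply a total-variation prox to per-bin weights.
import numpy as np

def binarize_feature(x, n_bins=10):
    """One-hot encode a continuous feature into quantile bins."""
    edges = np.quantile(x, np.linspace(0, 1, n_bins + 1)[1:-1])  # interior cut points
    idx = np.digitize(x, edges)              # bin index in {0, ..., n_bins - 1}
    return np.eye(n_bins)[idx]               # binarized (indicator) design block

def prox_tv_1d(w, lam, n_iter=500):
    """Approximate prox of lam * sum_j |w_{j+1} - w_j| (1D total variation),
    via projected gradient on the dual problem (step 0.25 <= 1 / ||D D^T||)."""
    z = np.zeros(max(len(w) - 1, 0))         # dual variable, one per weight difference
    for _ in range(n_iter):
        u = w + np.diff(np.concatenate(([0.0], z, [0.0])))   # primal point u = w - D^T z
        z = np.clip(z + 0.25 * np.diff(u), -lam, lam)        # gradient step + box projection
    return w + np.diff(np.concatenate(([0.0], z, [0.0])))

rng = np.random.default_rng(0)
x = rng.normal(size=200)                     # toy continuous feature
B = binarize_feature(x, n_bins=8)            # shape (200, 8) one-hot block
w = rng.normal(size=8)                       # per-bin weights for this block
print(prox_tv_1d(w, lam=0.5))                # weights fused into piecewise-constant levels
```

The total-variation prox fuses adjacent bin weights into piecewise-constant levels, which is the mechanism that keeps a binarized design block from overfitting to its many indicator columns.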
Related Items (3)
- Binacox: automatic cut-point detection in high-dimensional Cox model with applications in genetics
- Estimation of a high-dimensional counting process without penalty for high-frequency events
- Multiple criteria sorting models and methods. I: Survey of the literature
Uses Software
Cites Work
- Kullback-Leibler aggregation and misspecified generalized linear models
- On the prediction performance of the Lasso
- Statistics for high-dimensional data. Methods, theory and applications.
- Enhancing sparsity by reweighted \(\ell _{1}\) minimization
- Optimal estimation in additive regression models
- High-dimensional additive modeling
- Asymptotics for Lasso-type estimators.
- On the Bayes-risk consistency of regularized boosting methods.
- Self-concordant analysis for logistic regression
- Simultaneous analysis of Lasso and Dantzig selector
- High-dimensional generalized linear models and the lasso
- Sparsity oracle inequalities for the Lasso
- Pathwise coordinate optimization
- Adaptive Lasso and group-Lasso for functional Poisson regression
- Optimization with Sparsity-Inducing Penalties
- Learning the Intensity of Time Events With Change-Points
- The Group Lasso for Logistic Regression
- Uncertainty principles and ideal atomic decomposition
- Sparse Additive Models
- Sparsity and Smoothness Via the Fused Lasso
- The Lasso, correlated design, and improved oracle inequalities
- Convex analysis and monotone operator theory in Hilbert spaces
- The elements of statistical learning. Data mining, inference, and prediction
- Random forests
- Stochastic gradient boosting.
This page was built for publication: Binarsity: a penalization for one-hot encoded features in linear supervised learning