DOI: 10.1007/s100970100031
zbMATH: 1037.62001
OpenAlex: W1995771589
Wikidata: Q92198678 (Scholia: Q92198678)
MaRDI QID: Q5945247
Pascal Massart, Lucien Birgé
Publication date: 2001
Published in: Journal of the European Mathematical Society (JEMS)
Full work available at URL: https://doi.org/10.1007/s100970100031
Greedy algorithms for prediction
Least angle regression. (With discussion)
Detecting possibly frequent change-points: wild binary segmentation 2 and steepest-drop model selection
Near-ideal model selection by \(\ell _{1}\) minimization
Minimal penalties for Gaussian model selection
SLOPE is adaptive to unknown sparsity and asymptotically minimax
A penalized criterion for variable selection in classification
GSDAR: a fast Newton algorithm for \(\ell_0\) regularized generalized linear models with statistical guarantee
Nonparametric denoising of signals with unknown local structure. I: Oracle inequalities
Local inference by penalization method for biclustering model
Solution of linear ill-posed problems by model selection and aggregation
Adaptive density estimation using the blockwise Stein method
Estimation of matrices with row sparsity
Statistical estimation with model selection
General maximum likelihood empirical Bayes estimation of normal means
Semiparametric inference for mixtures of circular data
Divergence rates of Markov order estimators and their application to statistical estimation of stationary ergodic processes
How can we identify the sparsity structure pattern of high-dimensional data: an elementary statistical analysis to interpretable machine learning
Adaptive confidence sets in \(L^2\)
Estimating the joint distribution of independent categorical variables via model selection
Optimal rates for plug-in estimators of density level sets
Adaptive estimation for Hawkes processes; application to genome analysis
Empirical risk minimization as parameter choice rule for general linear regularization methods
Estimating piecewise monotone signals
Consistent model selection criteria and goodness-of-fit test for common time series models
Regularization in statistics
Adaptive and optimal online linear regression on \(\ell^1\)-balls
Honest and adaptive confidence sets in \(L_p\)
Adaptive tests of linear hypotheses by model selection
Rejoinder to the comments on: \(\ell _{1}\)-penalization for mixture regression models
Estimator selection: a new method with applications to kernel density estimation
Needles and straw in a haystack: posterior concentration for possibly sparse sequences
A Bernstein-type inequality for suprema of random processes with applications to model selection in non-Gaussian regression
On estimation of isotonic piecewise constant signals
Model selection: from theory to practice
SLOPE-adaptive variable selection via convex optimization
General model selection estimation of a periodic regression with a Gaussian noise
Statistical inference for the optimal approximating model
Segmentation of the mean of heteroscedastic data via cross-validation
Minimax risks for sparse regressions: ultra-high dimensional phenomenons
Spatial adaptation in heteroscedastic regression: propagation approach
Selecting the length of a principal curve within a Gaussian model
Model selection in regression under structural constraints
Optimal model selection in heteroscedastic regression using piecewise polynomial functions
Adaptive estimation of linear functionals by model selection
Simultaneous estimation of the mean and the variance in heteroscedastic Gaussian regression
Model selection by resampling penalization
MAP model selection in Gaussian regression
The Lasso as an \(\ell _{1}\)-ball model selection procedure
Sparsity considerations for dependent variables
Oracle convergence rate of posterior under projection prior and Bayesian model selection
Estimating composite functions by model selection
A new approach to estimator selection
Model selection and sharp asymptotic minimaxity
Consistent change-point detection with kernels
Estimator selection with respect to Hellinger-type risks
Risk hull method and regularization by projections of ill-posed inverse problems
Sparse PCA: optimal rates and adaptive estimation
A general framework for Bayes structured linear models
Estimation and variable selection with exponential weights
Model selection for simplicial approximation
Wild binary segmentation for multiple change-point detection
An introduction to the Bayes information criterion: theoretical foundations and interpretation
Gaussian linear model selection in a dependent context
The Goldenshluger-Lepski method for constrained least-squares estimators over RKHSs
Multiple change-points detection by empirical Bayesian information criteria and Gibbs sampling induced stochastic search
Tail-greedy bottom-up data decompositions and fast multiple change-point detection
Sparse recovery under weak moment assumptions
Empirical Bayes oracle uncertainty quantification for regression
Simple arbitrage
Model selection for Gaussian regression with random design
Bayesian model selection and the concentration of the posterior of hyperparameters
A Kernel Multiple Change-point Algorithm via Model Selection
A new algorithm for fixed design regression and denoising
Adaptive estimation of stationary Gaussian fields
A survey of cross-validation procedures for model selection
Maxisets for model selection
On signal reconstruction in white noise using dictionaries
Consistent selection of the number of change-points via sample-splitting
Estimator selection in the Gaussian setting
High-dimensional Gaussian model selection on a Gaussian design
Spike and slab empirical Bayes sparse credible sets
Needles and straw in a haystack: robust confidence for possibly sparse sequences
Mixing least-squares estimators when the variance is unknown
Generalized mirror averaging and \(D\)-convex aggregation
A MOM-based ensemble method for robustness, subsampling and hyperparameter tuning
Estimating the intensity of a random measure by histogram type estimators
Penalized projection estimators of the Aalen multiplicative intensity
On the stability of the risk hull method for projection estimators
Slope heuristics: overview and implementation
A breakpoint detection in the mean model with heterogeneous variance on fixed time intervals
A simple forward selection procedure based on false discovery rate control
Empirical Bayesian test of the smoothness
Joint segmentation of wind speed and direction using a hierarchical model
Gaussian model selection with an unknown variance
Estimation of the conditional risk in classification: the swapping method
Consistency of a range of penalised cost approaches for detecting multiple changepoints
Oracle inequalities for inverse problems
Estimation and model selection for model-based clustering with the conditional classification likelihood
Adaptive estimation over anisotropic functional classes via oracle approach
Some theoretical results regarding the polygonal distribution
Concentration inequalities, counting processes and adaptive statistics
PREDICTION/ESTIMATION WITH SIMPLE LINEAR MODELS: IS IT REALLY THAT SIMPLE?
Frequentist validity of Bayesian limits
Adaptive and efficient estimation in the Gaussian sequence model
Optimal false discovery control of minimax estimators
Data-driven selection of the number of change-points via error rate control
Robust oracle estimation and uncertainty quantification for possibly sparse quantiles
Data-driven model selection for same-realization predictions in autoregressive processes
Optimal change-point detection and localization
Theory of adaptive estimation
Fast exact Bayesian inference for sparse signals in the normal sequence model
Detection of multiple change-points in multivariate data
Sparse model selection under heterogeneous noise: exact penalisation and data-driven thresholding
Model selection for density estimation with \(\mathbb L_2\)-loss
Aggregation for Gaussian regression
A Semiparametric Change-Point Regression Model for Longitudinal Observations
On optimality of Bayesian testimation in the normal means problem
Detection of multiple change-points in multivariate time series
The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder.)
Testing the linearity in partially linear models
Adaptive sequential estimation for ergodic diffusion processes in quadratic metric
Prediction of time series by statistical learning: general losses and fast rates
Some applications of concentration inequalities to statistics
Covariate selection for semiparametric hazard function regression models
A Modified Bayes Information Criterion with Applications to the Analysis of Comparative Genomic Hybridization Data
Model selection for estimating the non zero components of a Gaussian vector
Adaptive nonparametric confidence sets
Adapting to unknown sparsity by controlling the false discovery rate
Adaptive minimax estimation of a fractional derivative
Compensator and exponential inequalities for some suprema of counting processes
Nonparametric Estimation of the Hazard Function by Using a Model Selection Method: Estimation of Cancer Deaths in Hiroshima Atomic Bomb Survivors
Block-Diagonal Covariance Selection for High-Dimensional Gaussian Graphical Models
Sharp oracle inequalities and slope heuristic for specification probabilities estimation in discrete random fields
High-dimensional regression with unknown variance
Sparse estimation by exponential weighting
How many bins should be put in a regular histogram
Risk hull method for spectral regularization in linear statistical inverse problems
Local comparison of empirical distributions via nonparametric regression
Adaptive estimation with soft thresholding penalties
Mixture of linear mixed models for clustering gene expression profiles from repeated microarray experiments
Road trafficking description and short term travel time forecasting, with a classification method
Histogram selection in non Gaussian regression
BOOTSTRAP TESTS FOR THE ERROR DISTRIBUTION IN LINEAR AND NONPARAMETRIC REGRESSION MODELS
Multiscale Change Point Inference
Inference for single and multiple change-points in time series
Empirical Bayes selection of wavelet thresholds
Local Posterior Concentration Rate for Multilevel Sparse Sequences
Model selection for regression on a random design