Statistics for high-dimensional data. Methods, theory and applications.
From MaRDI portal
Publication: 532983
DOI: 10.1007/978-3-642-20192-9
zbMath: 1273.62015
OpenAlex: W4247571494
Wikidata: Q57256043
Scholia: Q57256043
MaRDI QID: Q532983
Publication date: 2 May 2011
Published in: Springer Series in Statistics
Full work available at URL: https://doi.org/10.1007/978-3-642-20192-9
Estimation in multivariate analysis (62H12); Linear inference, regression (62Jxx); Parametric inference (62Fxx); Research exposition (monographs, survey articles) pertaining to statistics (62-02); Applications of statistics (62Pxx)
Related Items (only showing first 100 items)
Greedy algorithms for prediction ⋮ An introduction to recent advances in high/infinite dimensional statistics ⋮ Direct shrinkage estimation of large dimensional precision matrix ⋮ Worst possible sub-directions in high-dimensional models ⋮ Optimal sampling designs for nonparametric estimation of spatial averages of random fields ⋮ Robust methods for inferring sparse network structures ⋮ On cross-validated Lasso in high dimensions ⋮ Interquantile shrinkage and variable selection in quantile regression ⋮ Significance testing in non-sparse high-dimensional linear models ⋮ On the prediction loss of the Lasso in the partially labeled setting ⋮ D-learning to estimate optimal individual treatment rules ⋮ The main contributions of robust statistics to statistical science and a new challenge ⋮ Local linear smoothing for sparse high dimensional varying coefficient models ⋮ Forward variable selection for sparse ultra-high-dimensional generalized varying coefficient models ⋮ Notes on the dimension dependence in high-dimensional central limit theorems for hyperrectangles ⋮ Regularity properties for sparse regression ⋮ Local independence feature screening for nonparametric and semiparametric models by marginal empirical likelihood ⋮ Best subset selection via a modern optimization lens ⋮ High dimensional regression for regenerative time-series: an application to road traffic modeling ⋮ Variable selection in multivariate linear models for functional data via sparse regularization ⋮ Censored linear model in high dimensions. Penalised linear regression on high-dimensional data with left-censored response variable ⋮ Model selection for factorial Gaussian graphical models with an application to dynamic regulatory networks ⋮ The use of vector bootstrapping to improve variable selection precision in Lasso models ⋮ The benefit of group sparsity in group inference with de-biased scaled group Lasso ⋮ Geometric inference for general high-dimensional linear inverse problems ⋮ Solution of linear ill-posed problems using overcomplete dictionaries ⋮ Oracle inequalities, variable selection and uniform inference in high-dimensional correlated random effects panel data models ⋮ Econometric estimation with high-dimensional moment equalities ⋮ A rank-corrected procedure for matrix completion with fixed basis coefficients ⋮ Fusion of hard and soft information in nonparametric density estimation ⋮ PBoostGA: pseudo-boosting genetic algorithm for variable ranking and selection ⋮ Sub-optimality of some continuous shrinkage priors ⋮ \(\ell_{0}\)-penalized maximum likelihood for sparse directed acyclic graphs ⋮ Statistical significance in high-dimensional linear models ⋮ The geometry of least squares in the 21st century ⋮ The Bernstein-Orlicz norm and deviation inequalities ⋮ Marginal empirical likelihood and sure independence feature screening ⋮ Bayesian regression based on principal components for high-dimensional data ⋮ Non-negative least squares for high-dimensional linear models: consistency and sparse recovery without regularization ⋮ Correlated variables in regression: clustering and sparse estimation ⋮ Discussion of "Correlated variables in regression: clustering and sparse estimation" ⋮ Bayesian linear regression with sparse priors ⋮ Functional additive regression ⋮ A new test of independence for high-dimensional data ⋮ Alignment based kernel learning with a continuous set of base kernels ⋮ Model selection and structure specification in ultra-high dimensional generalised semi-varying coefficient models ⋮
Lasso-type estimators for semiparametric nonlinear mixed-effects models estimation ⋮ \(\ell_1\)-regularization of high-dimensional time-series models with non-Gaussian and heteroskedastic errors ⋮ Prediction consistency of forward iterated regression and selection technique ⋮ On the strong convergence of the optimal linear shrinkage estimator for large dimensional covariance matrix ⋮ Selection of tuning parameters in bridge regression models via Bayesian information criterion ⋮ New concentration inequalities for suprema of empirical processes ⋮ Does modeling lead to more accurate classification? A study of relative efficiency in linear classification ⋮ A new perspective on least squares under convex constraint ⋮ CAM: causal additive models, high-dimensional order search and penalized regression ⋮ \(L_1\)-penalization in functional linear regression with subgaussian design ⋮ On higher order isotropy conditions and lower bounds for sparse quadratic forms ⋮ Normalized and standard Dantzig estimators: two approaches ⋮ High-dimensional inference in misspecified linear models ⋮ Operator-valued kernel-based vector autoregressive models for network inference ⋮ On the residual empirical process based on the ALASSO in high dimensions and its functional oracle property ⋮ Oracle inequalities for high dimensional vector autoregressions ⋮ A flexible semiparametric forecasting model for time series ⋮ Weighted \(\ell_1\)-penalized corrected quantile regression for high dimensional measurement error models ⋮ Robust inference on average treatment effects with possibly more covariates than observations ⋮ Toward a unified theory of sparse dimensionality reduction in Euclidean space ⋮ Smooth predictive model fitting in regression ⋮ Consistent learning by composite proximal thresholding ⋮ Interpreting latent variables in factor models via convex optimization ⋮ Additive model selection ⋮ Robust matrix completion ⋮ Finding a low-rank basis in a matrix subspace ⋮ Sparse recovery under weak moment assumptions ⋮ Sparse clustering of functional data ⋮ Covariate Selection in High-Dimensional Generalized Linear Models With Measurement Error ⋮ Sparse classification with paired covariates ⋮ Model-assisted inference for treatment effects using regularized calibrated estimation with high-dimensional data ⋮ On asymptotically optimal confidence regions and tests for high-dimensional models ⋮ Lasso Inference for High-Dimensional Time Series ⋮ Variable selection in discrete survival models including heterogeneity ⋮ Selection by partitioning the solution paths ⋮ Gelfand numbers related to structured sparsity and Besov space embeddings with small mixed smoothness ⋮ Double Machine Learning for Partially Linear Mixed-Effects Models with Repeated Measurements ⋮ Regularizing Double Machine Learning in Partially Linear Endogenous Models ⋮ Confidence intervals for high-dimensional inverse covariance estimation ⋮ The Hardness of Conditional Independence Testing and the Generalised Covariance Measure ⋮ Geometric median and robust estimation in Banach spaces ⋮ Estimation of high-dimensional graphical models using regularized score matching ⋮ An Automated Approach Towards Sparse Single-Equation Cointegration Modelling ⋮ A Projection Based Conditional Dependence Measure with Applications to High-dimensional Undirected Graphical Models ⋮ Main effects and interactions in mixed and incomplete data frames ⋮ Honest confidence regions and optimality in high-dimensional precision matrix estimation ⋮
Joint variable and rank selection for parsimonious estimation of high-dimensional matrices ⋮ Entropy and sampling numbers of classes of ridge functions ⋮ Sample size determination for training cancer classifiers from microarray and RNA-seq data ⋮ Sparse hierarchical regression with polynomials ⋮ Model selection in linear mixed models ⋮ Orthogonal one step greedy procedure for heteroscedastic linear models ⋮ A focused information criterion for graphical models in fMRI connectivity with high-dimensional data ⋮ Estimation of a delta-contaminated density of a random intensity of Poisson data