The geometry of least squares in the 21st century
Abstract: It has been over 200 years since Gauss's and Legendre's famous priority dispute over who discovered the method of least squares. Nevertheless, we argue that the normal equations are still relevant in many facets of modern statistics, particularly in the domain of high-dimensional inference. Even today, we are still learning new things about the law of large numbers, first described in Bernoulli's Ars Conjectandi 300 years ago, as it applies to high-dimensional inference. A further insight the normal equations provide is the asymptotic Gaussianity of the least squares estimators. Gaussian processes, the general form of the Gaussian distribution, are another tool used in modern high-dimensional inference. The Gaussian distribution also arises via the central limit theorem in describing the weak convergence of the usual least squares estimators. In terms of high-dimensional inference, we are still missing the right notion of weak convergence. In this mostly expository work, we try to describe how both the normal equations and the theory of Gaussian processes, which we refer to as the "geometry of least squares," apply to many questions of current interest.
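For readers who want the objects the abstract alludes to written out explicitly, the following recap (not part of the original entry, and stated in the standard linear-model notation \(y = X\beta + \varepsilon\), which the abstract does not itself introduce) gives the normal equations and the classical asymptotic Gaussianity of the least squares estimator:
\[
X^\top X\,\hat\beta = X^\top y \quad\Longrightarrow\quad \hat\beta = (X^\top X)^{-1}X^\top y \ \text{ whenever } X^\top X \text{ is invertible},
\]
and, under the classical assumptions that \(\tfrac{1}{n}X^\top X \to \Sigma\) with \(\Sigma\) positive definite and the errors are i.i.d.\ with mean zero and variance \(\sigma^2\),
\[
\sqrt{n}\,\bigl(\hat\beta - \beta\bigr) \xrightarrow{\;d\;} N\bigl(0,\ \sigma^2\Sigma^{-1}\bigr) \qquad \text{as } n \to \infty \text{ with } p \text{ fixed}.
\]
It is precisely this fixed-\(p\) limit that the abstract says lacks the "right notion of weak convergence" in the high-dimensional setting.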
Cites work
- scientific article; zbMATH DE number 3786012 (title not available)
- scientific article; zbMATH DE number 1560711 (title not available)
- scientific article; zbMATH DE number 2107836 (title not available)
- scientific article; zbMATH DE number 845714 (title not available)
- scientific article; zbMATH DE number 236540 (title not available)
- A Gaussian kinematic formula
- A general expression for the distribution of the maximum of a Gaussian field and the approximation of the tail
- A lasso for hierarchical interactions
- A perturbation method for inference on regularized regression estimates
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- Adaptive confidence intervals for the test error in classification
- Boundary corrections for the expected Euler characteristic of excursion sets of random fields, with an application to astrophysics
- Compressed sensing
- Curvature Measures
- Degrees of freedom in lasso problems
- Detecting Sparse Signals in Random Fields, With an Application to Brain Mapping
- Distributed optimization and statistical learning via the alternating direction method of multipliers
- Estimation of the mean of a multivariate normal distribution
- Exact matrix completion via convex optimization
- Excursion sets of three classes of stable random fields
- Gauss and the invention of least squares
- High level excursion set geometry for non-Gaussian infinitely divisible random fields
- High-dimensional variable selection
- Intrinsic volumes and Gaussian processes
- Least angle regression. (With discussion)
- Model Selection and Estimation in Regression with Grouped Variables
- NESTA: A fast and accurate first-order method for sparse recovery
- On the Volume of Tubes
- On the equivalence of the tube and Euler characteristic methods for the distribution of the maximum of Gaussian fields over piecewise smooth domains
- Pathwise coordinate optimization
- Proximal methods for hierarchical sparse coding
- Random Fields and Geometry
- Random fields and the geometry of Wiener space
- Random fields of multivariate test statistics, with applications to shape analysis
- Rejoinder: ``A significance test for the lasso''
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- Sign-constrained least squares estimation for high-dimensional regression
- Simultaneous analysis of Lasso and Dantzig selector
- Smooth minimization of non-smooth functions
- Sparse inverse covariance estimation with the graphical lasso
- Spectral regularization algorithms for learning large incomplete matrices
- Statistical significance in high-dimensional linear models
- Statistics for high-dimensional data. Methods, theory and applications.
- Support union recovery in high-dimensional multivariate regression
- Tail probabilities of the maxima of Gaussian random fields
- The Lasso problem and uniqueness
- The expected number of local maxima of a random field and the volume of tubes
- The geometry of exponential families
- The solution path of the generalized lasso
- Validity of the expected Euler characteristic heuristic
- Weights of \(\overline{\chi}^2\) distribution for smooth or piecewise smooth cone alternatives
Cited in (5)
- Geometry and applied statistics
- Inference in adaptive regression via the Kac-Rice formula
- Recovering structured signals in noise: least-squares meets compressed sensing
- Penalized least square in sparse setting with convex penalty and non Gaussian errors
- A tutorial history of least squares with applications to astronomy and geodesy