The geometry of least squares in the 21st century
From MaRDI portal
Publication: Q373539
DOI: 10.3150/12-BEJSP15
zbMATH Open: 1402.62157
arXiv: 1309.7837
MaRDI QID: Q373539
Authors: Jonathan Taylor
Publication date: 17 October 2013
Published in: Bernoulli
Abstract: It has been over 200 years since Gauss's and Legendre's famous priority dispute over who discovered the method of least squares. Nevertheless, we argue that the normal equations are still relevant in many facets of modern statistics, particularly in the domain of high-dimensional inference. Even today, we are still learning new things about the law of large numbers, first described in Bernoulli's Ars Conjectandi 300 years ago, as it applies to high-dimensional inference. The other insight the normal equations provide is the asymptotic Gaussianity of the least squares estimators. Gaussian processes, the general form of the Gaussian distribution, are another tool used in modern high-dimensional inference. The Gaussian distribution also arises via the central limit theorem in describing the weak convergence of the usual least squares estimators. In terms of high-dimensional inference, we are still missing the right notion of weak convergence. In this mostly expository work, we try to describe how both the normal equations and the theory of Gaussian processes, what we refer to as the "geometry of least squares," apply to many questions of current interest.
Full work available at URL: https://arxiv.org/abs/1309.7837
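The abstract's central objects, the normal equations X^T X β = X^T y and the least squares estimator they characterize, can be illustrated with a minimal sketch (not taken from the paper; the data and dimensions below are invented for demonstration):

```python
import numpy as np

# Simulate a small linear model y = X beta + noise (illustrative values).
rng = np.random.default_rng(0)
n, p = 100, 3
X = rng.standard_normal((n, p))
beta_true = np.array([1.0, -2.0, 0.5])
y = X @ beta_true + 0.1 * rng.standard_normal(n)

# Solve the normal equations X^T X beta = X^T y directly...
beta_normal = np.linalg.solve(X.T @ X, X.T @ y)

# ...and compare with NumPy's least squares routine, which uses a
# numerically more stable factorization of X.
beta_lstsq, *_ = np.linalg.lstsq(X, y, rcond=None)

print(np.allclose(beta_normal, beta_lstsq))  # the two solutions agree
```

Solving via `np.linalg.lstsq` is preferred in practice because forming X^T X squares the condition number, but the normal-equations form is the one whose geometry the paper discusses.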
Mathematics Subject Classification
- Linear regression; mixed models (62J05)
- Gaussian processes (60G15)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- History of statistics (62-03)
- History of mathematics in the 21st century (01A61)
Cites Work
- NESTA: A fast and accurate first-order method for sparse recovery
- Spectral regularization algorithms for learning large incomplete matrices
- A lasso for hierarchical interactions
- Least angle regression. (With discussion)
- Pathwise coordinate optimization
- Statistics for high-dimensional data. Methods, theory and applications.
- Estimation of the mean of a multivariate normal distribution
- Distributed optimization and statistical learning via the alternating direction method of multipliers
- Simultaneous analysis of Lasso and Dantzig selector
- Curvature Measures
- Model Selection and Estimation in Regression with Grouped Variables
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- The solution path of the generalized lasso
- Smooth minimization of non-smooth functions
- Sparse inverse covariance estimation with the graphical lasso
- Sign-constrained least squares estimation for high-dimensional regression
- Statistical significance in high-dimensional linear models
- Degrees of freedom in lasso problems
- Proximal methods for hierarchical sparse coding
- High-dimensional variable selection
- Exact matrix completion via convex optimization
- The Lasso problem and uniqueness
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- On the Volume of Tubes
- Compressed sensing
- Tail probabilities of the maxima of Gaussian random fields
- Adaptive confidence intervals for the test error in classification
- A perturbation method for inference on regularized regression estimates
- Random Fields and Geometry
- On the equivalence of the tube and Euler characteristic methods for the distribution of the maximum of Gaussian fields over piecewise smooth domains
- Validity of the expected Euler characteristic heuristic
- Excursion sets of three classes of stable random fields
- Random fields and the geometry of Wiener space
- A general expression for the distribution of the maximum of a Gaussian field and the approximation of the tail
- High level excursion set geometry for non-Gaussian infinitely divisible random fields
- Rejoinder: "A significance test for the lasso"
- Detecting Sparse Signals in Random Fields, With an Application to Brain Mapping
- The geometry of exponential families
- Support union recovery in high-dimensional multivariate regression
- Boundary corrections for the expected Euler characteristic of excursion sets of random fields, with an application to astrophysics
- Gauss and the invention of least squares
- The expected number of local maxima of a random field and the volume of tubes
- Weights of \(\overline{\chi}^2\) distribution for smooth or piecewise smooth cone alternatives
- Random fields of multivariate test statistics, with applications to shape analysis
- A Gaussian kinematic formula
- Intrinsic volumes and Gaussian processes
Cited In (5)
- Geometry and applied statistics
- Inference in adaptive regression via the Kac-Rice formula
- Recovering structured signals in noise: least-squares meets compressed sensing
- Penalized least square in sparse setting with convex penalty and non Gaussian errors
- A tutorial history of least squares with applications to astronomy and geodesy