HARFE: hard-ridge random feature expansion
From MaRDI portal
Publication: 6049834
DOI: 10.1007/s43670-023-00063-9
zbMath: 1520.65012
arXiv: 2202.02877
OpenAlex: W4385705583
MaRDI QID: Q6049834
Esha Saha, Hayden Schaeffer, Giang Tran
Publication date: 18 September 2023
Published in: Sampling Theory, Signal Processing, and Data Analysis
Full work available at URL: https://arxiv.org/abs/2202.02877
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Numerical optimization and variational techniques (65K10)
- Multidimensional problems (41A63)
- Algorithms for approximation of functions (65D15)
- Numerical approximation of high-dimensional functions; sparse grids (65D40)
Cites Work
- A mathematical introduction to compressive sensing
- A branch-and-cut decomposition algorithm for solving chance-constrained mathematical programs with finite support
- Fast high-dimensional approximation with sparse occupancy trees
- Iterative hard thresholding for compressed sensing
- Hard thresholding pursuit algorithms: number of iterations
- Component selection and smoothing in multivariate nonparametric regression
- Kernel methods: a survey of current techniques
- Generalization bounds for sparse random feature expansions
- Surprises in high-dimensional ridgeless least squares interpolation
- Generalization error of random feature and kernel methods: hypercontractivity and kernel matrix concentration
- Sparse high-dimensional regression: exact scalable algorithms and phase transitions
- Just interpolate: kernel "ridgeless" regression can generalize
- Learning rates for the risk of kernel-based quantile regression estimators in additive models
- Hard Thresholding Pursuit: An Algorithm for Compressive Sensing
- Multivariate Regression and Machine Learning with Sums of Separable Functions
- Interpretable Approximation of High-Dimensional Data
- Benign overfitting in linear regression
- Fast Best Subset Selection: Coordinate Descent and Local Combinatorial Optimization Algorithms
- Scalable Algorithms for the Sparse Ridge Regression
- Approximation of High-Dimensional Periodic Functions with Fourier-Based Methods
- Reconciling modern machine-learning practice and the classical bias–variance trade-off
- Algorithms for learning sparse additive models with interactions in high dimensions*
- On the Equivalence between Kernel Quadrature Rules and Random Feature Expansions
- Learning Bounds for Kernel Regression Using Effective Data Dimensionality
- Subset Selection with Shrinkage: Sparse Linear Modeling When the SNR Is Low