Mathematical programming for simultaneous feature selection and outlier detection under l1 norm
From MaRDI portal
Publication:6565451
Cites work
- Scientific article; zbMATH DE number 845714 (no title available)
- A branch-and-cut algorithm for mixed-integer bilinear programming
- A new look at the statistical model identification
- A survey of outlier detection methodologies
- An introduction to statistical learning. With applications in R
- Best subset selection via a modern optimization lens
- Best subset selection, persistence in high-dimensional statistical learning and optimization under \(l_1\) constraint
- Computability of global solutions to factorable nonconvex programs: Part I — Convex underestimating problems
- Disjunctive programming: Properties of the convex hull of feasible points
- Joint outlier detection and variable selection using discrete optimization
- Kernel density outlier detector
- LAD regression for detecting outliers in response and explanatory variables
- Least Median of Squares Regression
- Leveraged least trimmed absolute deviations
- Mixed integer linear programming formulation techniques
- Mixed-integer nonlinear programs featuring ``on/off'' constraints
- Nearly unbiased variable selection under minimax concave penalty
- On handling indicator constraints in mixed integer programming
- On mathematical programming with indicator constraints
- Piecewise Linear Function Fitting via Mixed-Integer Linear Programming
- Quadratic mixed integer programming and support vectors for deleting outliers in robust regression
- Regularization and Variable Selection Via the Elastic Net
- Relaxed Lasso
- Robust linear regression for high-dimensional data: an overview
- Robust subset selection
- Simultaneous estimation and variable selection in median regression using Lasso-type penalty
- Solving mixed integer bilinear problems using MILP formulations
- Sparse Approximate Solutions to Linear Systems
- Sparse least trimmed squares regression for analyzing high-dimensional large data sets
- Sparse regression for large data sets with outliers
- The Adaptive Lasso and Its Oracle Properties
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- The Elements of Statistical Learning
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Variable selection and parameter estimation via WLAD-SCAD with a diverging number of parameters
- Weighted LAD-LASSO method for robust parameter estimation and variable selection in regression