Sparse high-dimensional regression: exact scalable algorithms and phase transitions (Q2176621)

From MaRDI portal
Cites work

    Branch-and-Price: Column Generation for Solving Huge Integer Programs
    Best subset selection via a modern optimization lens
    Sparse regression: scalable algorithms and empirical performance
    Sparse high-dimensional regression: exact scalable algorithms and phase transitions
    A brief history of linear and mixed-integer programming computation
    Statistics for high-dimensional data. Methods, theory and applications.
    Observed universality of phase transitions in high-dimensional geometry, with implications for modern data analysis and signal processing
    An outer-approximation algorithm for a class of mixed-integer nonlinear programs
    Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
    Solving mixed integer nonlinear programs by outer approximation
    Regressions by Leaps and Bounds
    Sparse high-dimensional linear regression. Estimating squared error and a phase transition
    Updating the Inverse of a Matrix
    Q5251797
    Branch-and-Bound Methods: A Survey
    Matching pursuits with time-frequency dictionaries
    On general minimax theorems
    Q4864293
    Q4250979
    Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using $\ell_1$-Constrained Quadratic Programming (Lasso)
    On the Performance of Sparse Recovery Via $\ell_p$-Minimization $(0 \leq p \leq 1)$
    Q2880935
    Nearly unbiased variable selection under minimax concave penalty
    Does $\ell_p$-Minimization Outperform $\ell_1$-Minimization?
    Regularization and Variable Selection Via the Elastic Net


Language: English
Label: Sparse high-dimensional regression: exact scalable algorithms and phase transitions
Description: scientific article

    Statements

    Sparse high-dimensional regression: exact scalable algorithms and phase transitions (English)
    5 May 2020
    This paper proposes a novel binary convex reformulation of the sparse regression problem and devises a new cutting plane method that solves it exactly. Evidence is provided that the cutting plane algorithm scales quickly to provable optimality for sample sizes \(n\) and numbers of regressors \(p\) in the 100,000s. A new phase transition is identified, both in the ability to recover the true coefficients of the sparse regression problem and in the ability to solve it. An extension from sparse linear regression to nonlinear regression, obtained by augmenting the input data \(X\) with auxiliary nonlinear transformations, is also discussed.
    sparse regression
    kernel learning
    integer optimization
    convex optimization
    subset selection
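As a rough, self-contained illustration of the cutting plane idea summarized above, the sketch below applies outer approximation to the binary convex reformulation \(\min\{c(s) : s \in \{0,1\}^p,\ \sum_j s_j \le k\}\) with \(c(s) = \tfrac12 y^\top (I_n + \gamma \sum_j s_j x_j x_j^\top)^{-1} y\), i.e. the ridge-regularized subset-selection objective. It is not the authors' implementation: the function names and the example data are made up for this illustration, and plain enumeration of size-\(k\) supports stands in for the mixed-integer master solver that makes the real method scale to \(n\), \(p\) in the 100,000s.

```python
# Minimal sketch (assumptions noted above) of a cutting-plane / outer-approximation
# loop for ridge-regularized best-subset selection. Not the authors' code.
import itertools
import numpy as np

def c_and_grad(s, X, y, gamma):
    """Convex objective c(s) = 0.5 * y' (I + gamma * X_s X_s')^{-1} y and its gradient."""
    n = X.shape[0]
    Xs = X[:, s.astype(bool)]
    K = np.eye(n) + gamma * Xs @ Xs.T
    alpha = np.linalg.solve(K, y)                # alpha = K^{-1} y
    val = 0.5 * y @ alpha
    grad = -0.5 * gamma * (X.T @ alpha) ** 2     # dc/ds_j = -gamma/2 * (x_j' alpha)^2
    return val, grad

def cutting_plane_sparse_regression(X, y, k, gamma=1.0, max_iter=50, tol=1e-8):
    """Add cuts eta >= c(s_t) + grad_t' (s - s_t) until the piecewise-linear
    lower bound matches the best support found so far."""
    p = X.shape[1]
    s_t = np.zeros(p)
    s_t[:k] = 1.0                                # naive warm start: first k features
    incumbent_val, incumbent_s = np.inf, None
    cuts = []
    for _ in range(max_iter):
        val, grad = c_and_grad(s_t, X, y, gamma)
        if val < incumbent_val:
            incumbent_val, incumbent_s = val, s_t.copy()
        cuts.append((val, grad, s_t.copy()))
        # Master problem: minimize the current lower bound over all size-k supports.
        # Enumeration stands in for the MILP solver used in practice.
        best_eta, best_s = np.inf, None
        for comb in itertools.combinations(range(p), k):
            s = np.zeros(p)
            s[list(comb)] = 1.0
            eta = max(v + g @ (s - s0) for v, g, s0 in cuts)
            if eta < best_eta:
                best_eta, best_s = eta, s
        if incumbent_val - best_eta <= tol:      # lower bound certifies optimality
            break
        s_t = best_s
    return incumbent_s, incumbent_val

# Tiny synthetic example: recover a 3-sparse signal among p = 10 regressors.
rng = np.random.default_rng(0)
X = rng.standard_normal((60, 10))
w_true = np.zeros(10)
w_true[[1, 4, 7]] = [2.0, -3.0, 1.5]
y = X @ w_true + 0.01 * rng.standard_normal(60)
support, obj = cutting_plane_sparse_regression(X, y, k=3, gamma=10.0)
print("selected features:", np.flatnonzero(support))
```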