Trading Accuracy for Sparsity in Optimization Problems with Sparsity Constraints
Publication: 3083309
DOI: 10.1137/090759574
zbMath: 1208.68226
OpenAlex: W1994520254
MaRDI QID: Q3083309
Tong Zhang, Nathan Srebro, Shai Shalev-Shwartz
Publication date: 21 March 2011
Published in: SIAM Journal on Optimization
Full work available at URL: https://doi.org/10.1137/090759574
Related Items
- Greedy strategies for convex optimization
- Convex optimization on Banach spaces
- Fast and scalable Lasso via stochastic Frank-Wolfe methods with a convergence guarantee
- Biorthogonal greedy algorithms in convex optimization
- Convergence and rate of convergence of some greedy algorithms in convex optimization
- Gradient projection Newton pursuit for sparsity constrained optimization
- Generalized greedy alternatives
- Adaptive and optimal online linear regression on \(\ell^1\)-balls
- Nonlinear tensor product approximation of functions
- Fully corrective gradient boosting with squared hinge: fast learning rates and early stopping
- Asymptotic linear convergence of fully-corrective generalized conditional gradient methods
- A greedy Newton-type method for multiple sparse constraint problem
- Sparse Approximation by Greedy Algorithms
- Non-Negative Sparse Regression and Column Subset Selection with L1 Error
- Greedy expansions in convex optimization
- Decomposable norm minimization with proximal-gradient homotopy algorithm
- Deviation optimal learning using greedy \(Q\)-aggregation
- Accelerated proximal stochastic dual coordinate ascent for regularized loss minimization
- Structured sparsity through convex optimization
- Generalized Conditional Gradient for Sparse Estimation
- Fast algorithms for supermodular and non-supermodular minimization via bi-criteria strategy
- Greedy approximation in convex optimization
- Gradient projection Newton algorithm for sparse collaborative learning using synthetic and real datasets of applications