The following pages link to Pegasos (Q20752):
Displaying 50 items.
- An efficient method for clustered multi-metric learning (Q2200672)
- Kernel-based online regression with canal loss (Q2242215)
- A stochastic trust region method for unconstrained optimization problems (Q2298821)
- Developing an online general type-2 fuzzy classifier using evolving type-1 rules (Q2302784)
- Decentralized hierarchical constrained convex optimization (Q2303528)
- Fast and strong convergence of online learning algorithms (Q2305549)
- Hyper-parameter optimization for support vector machines using stochastic gradient descent and dual coordinate descent (Q2308188)
- An efficient primal dual prox method for non-smooth optimization (Q2339936)
- Premise selection for mathematics by corpus analysis and kernel methods (Q2352489)
- Incremental accelerated gradient methods for SVM classification: study of the constrained approach (Q2355191)
- Stochastic subgradient descent method for large-scale robust chance-constrained support vector machines (Q2359408)
- Binary vectors for fast distance and similarity estimation (Q2362826)
- An optimal subgradient algorithm with subspace search for costly convex optimization problems (Q2415906)
- A fast SVD-hidden-nodes based extreme learning machine for large-scale data analytics (Q2418132)
- Image classification with the Fisher vector: theory and practice (Q2450399)
- Subgradient-based neural network for nonconvex optimization problems in support vector machines with indefinite kernels (Q2515282)
- New smoothing SVM algorithm with tight error bound and efficient reduced techniques (Q2636609)
- Improving kernel online learning with a snapshot memory (Q2673322)
- On the convergence analysis of asynchronous SGD for solving consistent linear systems (Q2685380)
- A subgradient method with non-monotone line search (Q2696908)
- Spectral projected subgradient method for nonsmooth convex optimization problems (Q2700023)
- Global Convergence of Online Limited Memory BFGS (Q2788401)
- (Q2788425)
- (Q2810841)
- Distributed Coordinate Descent Method for Learning with Big Data (Q2810888)
- (Q2933974)
- Block Stochastic Gradient Iteration for Convex and Nonconvex Optimization (Q2945126)
- An Introduction to Conditional Random Fields (Q3166528)
- How Effectively Train Large-Scale Machine Learning Models? (Q3299229)
- Distributed Block Coordinate Descent for Minimizing Partially Separable Functions (Q3462314)
- (Q4558169)
- (Q4558495)
- (Q4558543)
- Neural Networks and Deep Learning (Q4569250)
- Machine Learning for Text (Q4569273)
- Batched Stochastic Gradient Descent with Weighted Sampling (Q4609808)
- (Q4636973)
- (Q4637034)
- On the complexity of parallel coordinate descent (Q4638927)
- Optimization Methods for Large-Scale Machine Learning (Q4641709)
- Optimal Stochastic Approximation Algorithms for Strongly Convex Stochastic Composite Optimization I: A Generic Algorithmic Framework (Q4915174)
- Stochastic Subgradient Estimation Training for Support Vector Machines (Q4922840)
- Online training on a budget of support vector machines using twin prototypes (Q4969713)
- Large‐margin classification with multiple decision rules (Q4970185)
- Making the Last Iterate of SGD Information Theoretically Optimal (Q4987277)
- (Q5038378)
- (Q5116489)
- An efficient augmented Lagrangian method for support vector machine (Q5135259)
- New nonasymptotic convergence rates of stochastic proximal point algorithm for stochastic convex optimization (Q5162590)
- New machine-learning algorithms for prediction of Parkinson's disease (Q5172596)