The following pages link to Hideaki Iiduka (Q286843):
Displaying 27 items.
- Iterative Algorithm for Triple-Hierarchical Constrained Nonconvex Optimization Problem and Its Application to Network Bandwidth Allocation (Q4899015)
- (Q5024965)
- (Q5052078)
- (Q5084579)
- (Q5147189)
- (Q5151487)
- (Q5158466)
- (Q5158618)
- (Q5160447)
- Distributed Optimization for Network Resource Allocation With Nonsmooth Utility Functions (Q5211931)
- Iterative methods for parallel convex optimization with fixed point constraints (Q5244136)
- Acceleration Method Combining Broadcast and Incremental Distributed Optimization Algorithms (Q5245368)
- Fixed Point Optimization Algorithms for Distributed Optimization in Networked Systems (Q5300515)
- (Q5308301)
- Two stochastic optimization algorithms for convex optimization with fixed point constraints (Q5379458)
- (Q5465194)
- (Q5465310)
- (Q5477618)
- (Q5495617)
- Theoretical analysis of Adam using hyperparameters close to one without Lipschitz smoothness (Q6145578)
- Incremental and Parallel Machine Learning Algorithms with Automated Learning Rate Adjustments (Q6273466)
- Convergence of Riemannian Stochastic Gradient Descent on Hadamard Manifold (Q6463359)
- Modified memoryless spectral-scaling Broyden family on Riemannian manifolds (Q6511142)
- Modified memoryless spectral-scaling Broyden family on Riemannian manifolds (Q6608756)
- Applying conditional subgradient-like directions to the modified Krasnosel'skiĭ-Mann fixed point algorithm based on the three-term conjugate gradient method (Q6637963)
- Convergence of Riemannian stochastic gradient descent on Hadamard manifold (Q6664323)
- A general framework of Riemannian adaptive optimization methods with a convergence analysis (Q6742601)