Stochastic Backward Euler: An Implicit Gradient Descent Algorithm for k-means Clustering
Publication: 6292837
DOI: 10.1007/s10915-018-0744-4
arXiv: 1710.07746
MaRDI QID: Q6292837
Minh-Quy Pham, Penghang Yin, Adam Oberman, Stanley Osher
Publication date: 20 October 2017
Abstract: In this paper, we propose an implicit gradient descent algorithm for the classic $k$-means problem. The implicit gradient step, or backward Euler, is solved via stochastic fixed-point iteration, in which we randomly sample a mini-batch gradient in every iteration. It is the average of the fixed-point trajectory that is carried over to the next gradient step. We draw connections between the proposed stochastic backward Euler and the recent entropy stochastic gradient descent (Entropy-SGD) for improving the training of deep neural networks. Numerical experiments on various synthetic and real datasets show that the proposed algorithm provides better clustering results compared to $k$-means algorithms, in the sense that it decreases the objective function (the cluster energy) and is much more robust to initialization.
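A minimal NumPy sketch of one reading of the scheme described in the abstract: each implicit (backward Euler) step is approximated by a stochastic fixed-point iteration with mini-batch gradients, and the running average of the fixed-point trajectory is carried over as the next iterate. The function names, step size, batch size, and iteration counts below are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def minibatch_grad(centroids, batch):
    """Mini-batch gradient of the k-means cluster energy
    f(X) = (1/n) * sum_i min_k ||p_i - x_k||^2 w.r.t. the centroids X."""
    k = centroids.shape[0]
    # Assign each batch point to its nearest centroid.
    d2 = ((batch[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    labels = d2.argmin(axis=1)
    grad = np.zeros_like(centroids)
    for j in range(k):
        pts = batch[labels == j]
        if len(pts):
            # d f / d x_j = (2/|B|) * sum_{i in cluster j} (x_j - p_i)
            grad[j] = 2.0 * (centroids[j] - pts.mean(axis=0)) * len(pts) / len(batch)
    return grad

def stochastic_backward_euler(data, centroids, step=1.0, n_outer=50,
                              n_inner=20, batch_size=64, rng=None):
    """Sketch: approximate the implicit step X <- X - step * grad f(X)
    by the stochastic fixed-point iteration Y <- X - step * grad_B(Y),
    then carry the trajectory average over to the next outer step."""
    rng = rng or np.random.default_rng(0)
    X = centroids.copy()
    for _ in range(n_outer):
        Y, Y_avg = X.copy(), np.zeros_like(X)
        for j in range(1, n_inner + 1):
            idx = rng.choice(len(data), size=batch_size, replace=False)
            Y = X - step * minibatch_grad(Y, data[idx])  # fixed-point update
            Y_avg += (Y - Y_avg) / j                     # running average of trajectory
        X = Y_avg                                        # average carried to next step
    return X
```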