Distributed Gradient Clustering: Convergence and the Effect of Initialization
Published in 58th Asilomar Conference on Signals, Systems, and Computers [To appear], 2024
Abstract: We study the effects of center initialization on the performance of a family of distributed gradient-based clustering algorithms introduced in Armacki et al. (2025), which operate over connected networks of users. In the considered scenario, each user holds a local dataset and communicates only with its immediate neighbours, with the aim of finding a global clustering of the joint data. We perform extensive numerical experiments evaluating the effects of center initialization, demonstrating that our methods are more resilient to initialization than centralized gradient clustering. Next, inspired by the K-means++ initialization, we propose a novel distributed center initialization scheme, which is shown to improve the performance of our methods compared to the baseline random initialization.
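For context, the classical (centralized) K-means++ seeding that inspires the proposed distributed scheme picks the first center uniformly at random, then draws each subsequent center with probability proportional to its squared distance from the nearest center chosen so far. The sketch below illustrates that centralized rule only; it is not the paper's distributed variant, and the function name and interface are illustrative assumptions.

```python
import numpy as np

def kmeans_pp_init(data, k, rng=None):
    """Classical K-means++ seeding (centralized sketch, not the paper's
    distributed scheme). Picks the first center uniformly at random, then
    each subsequent center with probability proportional to its squared
    distance from the nearest center selected so far."""
    rng = np.random.default_rng(rng)
    n = data.shape[0]
    centers = [data[rng.integers(n)]]
    for _ in range(k - 1):
        # Squared distance of each point to its nearest already-chosen center.
        d2 = np.min(
            ((data[:, None, :] - np.asarray(centers)[None, :, :]) ** 2).sum(-1),
            axis=1,
        )
        # Sample the next center with D^2-weighting.
        centers.append(data[rng.choice(n, p=d2 / d2.sum())])
    return np.asarray(centers)
```

This D^2-weighting tends to spread the initial centers across the data, which is the property a distributed variant would need to approximate using only neighbour-to-neighbour communication.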
Recommended citation: Armacki, A., Sharma, H., Bajović, D., Jakovetić, D., Chakraborty, M., & Kar, S. (2024). Distributed Gradient Clustering: Convergence and the Effect of Initialization. 58th Asilomar Conference on Signals, Systems, and Computers [To appear] https://github.com/aarmacki/aarmacki.github.io/blob/master/publications/Asilomar_2024.pdf