[2603.20507] Distributed Gradient Clustering: Convergence and the Effect of Initialization
Computer Science > Machine Learning

arXiv:2603.20507 (cs)
[Submitted on 20 Mar 2026]

Title: Distributed Gradient Clustering: Convergence and the Effect of Initialization
Authors: Aleksandar Armacki, Himkant Sharma, Dragana Bajović, Dušan Jakovetić, Mrityunjoy Chakraborty, Soummya Kar

Abstract: We study the effect of center initialization on the performance of a family of distributed gradient-based clustering algorithms introduced in [1], which operate over connected networks of users. In the considered scenario, each user holds a local dataset and communicates only with its immediate neighbours, with the aim of finding a global clustering of the joint data. We perform extensive numerical experiments evaluating the effect of center initialization, and demonstrate that our methods are more resilient to initialization than centralized gradient clustering [2]. Next, inspired by $K$-means++ initialization [3], we propose a novel distributed center initialization scheme that is shown to improve the performance of our methods compared to baseline random initialization.

Subjects: Machine Learning (cs.LG); Machine Learning (stat.ML)
Cite as: arXiv:2603.20507 [cs.LG]
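For context, the abstract's proposed scheme is inspired by classical $K$-means++ seeding [3]. Below is a minimal sketch of that centralized seeding rule; the paper's distributed variant is not detailed in the abstract, so the function name and this single-machine formulation are illustrative assumptions only (a distributed version would have to realize the same sampling using neighbour-only communication).

```python
import numpy as np

def kmeanspp_init(X, k, seed=None):
    """Illustrative centralized K-means++ seeding (not the paper's
    distributed scheme): the first center is drawn uniformly at random;
    each subsequent center is drawn with probability proportional to the
    squared distance to the nearest already-chosen center."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    centers = [X[rng.integers(n)]]
    for _ in range(k - 1):
        # Squared distance from every point to its nearest current center.
        diffs = X[:, None, :] - np.asarray(centers)[None, :, :]
        d2 = np.min((diffs ** 2).sum(axis=-1), axis=1)
        # Sample the next center with D^2-weighted probabilities.
        centers.append(X[rng.choice(n, p=d2 / d2.sum())])
    return np.array(centers)
```

With two well-separated groups of identical points, the $D^2$ weighting forces the second center into the group the first center missed, which is exactly the spreading behaviour that makes this seeding less sensitive to bad random draws.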