About
I am a post-doctoral researcher at École Polytechnique Fédérale de Lausanne, hosted by Professor Ali H. Sayed. I hold a PhD in electrical and computer engineering from Carnegie Mellon University, where I had the privilege of being advised by Professor Soummya Kar. Broadly, my interests lie in signal processing and machine learning. More specifically, I study theoretical guarantees of both centralized and large-scale multi-agent learning systems in the presence of phenomena such as heavy-tailed noise and statistical heterogeneity across different users’ datasets.
If you have any questions about my research, feel free to reach out.
News
- October 2025 - A new preprint is out: “Improved High-probability Convergence Guarantees of Decentralized SGD”.
- October 2025 - Our paper “Large Deviation Upper Bounds and Improved MSE Rates of Nonlinear SGD: Heavy-tailed Noise and Power of Symmetry” was accepted for publication in the SIAM Journal on Optimization.
- July 2025 - A new preprint is out: “Optimal High-probability Convergence of Nonlinear SGD under Heavy-tailed Noise via Symmetrization”.
- June 2025 - I defended my PhD thesis! A big thank you to my advisor and committee members, as well as the many people I had the pleasure of collaborating with during my PhD studies. I will be joining EPFL as a post-doctoral researcher, hosted by Professor Ali H. Sayed. Looking forward to getting started in September!
- May 2025 - Our paper “Toward Understanding the Improved Robustness to Initialization in Distributed Clustering” was accepted for publication at the EUSIPCO 2025 conference.
- January 2025 - Our paper “High-probability Convergence Bounds for Online Nonlinear Stochastic Gradient Descent Under Heavy-tailed Noise” was accepted for publication at the International Conference on Artificial Intelligence and Statistics (AISTATS).
- January 2025 - Our paper “Distributed Center-based Clustering: A Unified Framework” was accepted for publication in IEEE Transactions on Signal Processing.
- October 2024 - A new preprint is out: “Large Deviations and Improved Mean-squared Error Rates of Nonlinear SGD: Heavy-tailed Noise and Power of Symmetry”.
- October 2024 - A new preprint is out: “Nonlinear Stochastic Gradient Descent and Heavy-tailed Noise: A Unified Framework and High-probability Guarantees”.
- February 2024 - A new preprint is out: “A Unified Framework for Gradient-based Clustering of Distributed Data”.
- December 2023 - Our paper “A One-shot Framework for Distributed Clustered Learning in Heterogeneous Environments” was accepted for publication in IEEE Transactions on Signal Processing.