About
I am an All But Dissertation (ABD) PhD student at Carnegie Mellon University, advised by Prof. Soummya Kar. Broadly, my interests lie in signal processing and machine learning. More specifically, I am interested in theoretical guarantees for both centralized and large-scale distributed learning systems. Recent projects include establishing learning guarantees in the presence of various notions of heterogeneity, e.g., statistical heterogeneity (non-IID distributions) (paper) or system heterogeneity (varying communication or computation capabilities) (paper), as well as learning guarantees under heavy-tailed noise (paper).
If you have any questions about my research, or believe our interests align, feel free to reach out.
News
- January 2025 - Our paper "High-probability Convergence Bounds for Online Nonlinear Stochastic Gradient Descent Under Heavy-tailed Noise" was accepted at the International Conference on Artificial Intelligence and Statistics (AISTATS). See you in Thailand!
- January 2025 - Our paper "Distributed Center-based Clustering: A Unified Framework" was accepted at IEEE Transactions on Signal Processing.
- October 2024 - New preprint: "Large Deviations and Improved Mean-squared Error Rates of Nonlinear SGD: Heavy-tailed Noise and Power of Symmetry".
- October 2024 - New preprint: "Nonlinear Stochastic Gradient Descent and Heavy-tailed Noise: A Unified Framework and High-probability Guarantees".
- February 2024 - New preprint: "A Unified Framework for Gradient-based Clustering of Distributed Data".
- December 2023 - Our paper "A One-shot Framework for Distributed Clustered Learning in Heterogeneous Environments" was accepted at IEEE Transactions on Signal Processing.
- October 2023 - New preprint: "High-probability Convergence Bounds for Nonlinear Stochastic Gradient Descent Under Heavy-tailed Noise".