About
I am an All But Dissertation (ABD) PhD student at Carnegie Mellon University, advised by Prof. Soummya Kar. Broadly, my interests lie in signal processing and machine learning. More specifically, I study theoretical guarantees for both centralized and large-scale multi-agent learning systems in the presence of phenomena such as heavy-tailed noise and statistical heterogeneity across different users' datasets.
If you have any questions about my research, feel free to reach out.
News
- January 2025 - Our paper "High-probability Convergence Bounds for Online Nonlinear Stochastic Gradient Descent Under Heavy-tailed Noise" was accepted at the International Conference on Artificial Intelligence and Statistics (AISTATS).
- January 2025 - Our paper "Distributed Center-based Clustering: A Unified Framework" was accepted in IEEE Transactions on Signal Processing.
- October 2024 - A new preprint is out: "Large Deviations and Improved Mean-squared Error Rates of Nonlinear SGD: Heavy-tailed Noise and Power of Symmetry".
- October 2024 - A new preprint is out: "Nonlinear Stochastic Gradient Descent and Heavy-tailed Noise: A Unified Framework and High-probability Guarantees".
- February 2024 - A new preprint is out: "A Unified Framework for Gradient-based Clustering of Distributed Data".
- December 2023 - Our paper "A One-shot Framework for Distributed Clustered Learning in Heterogeneous Environments" was accepted in IEEE Transactions on Signal Processing.