I'm a second-year MS student in Machine Learning at Carnegie Mellon University's School of Computer Science, where I develop scalable neural architectures for scientific computing and partial differential equations (PDEs). Advised by Prof. Andrej Risteski, I am currently working on resolution-invariant neural operator architectures and flow-based models for neural PDE solvers, with a focus on adaptive multi-resolution training and conditional flow models with optimal transport.
Previously, I was a Research Assistant in the Department of Electrical Communication Engineering at the Indian Institute of Science (IISc), advised by Prof. Prathosh A.P., where I focused on integrating geometric symmetries into neural networks. This work explored both discrete and continuous group symmetries, particularly for generative models and physics-based simulations. I also collaborated with Prof. Aditya Gopalan on unified symmetry-learning frameworks for discrete groups.
My publications span top-tier venues including AISTATS, NeurIPS, and ICLR, with several papers currently under review. You can find my complete publication list here.
I'm always interested in discussing research collaborations and new ideas. Feel free to reach out via email or connect with me on LinkedIn.
See my Research Overview page for more details on my research interests. You can find my latest CV here.