News

  • Sep 2023: Our papers on Wasserstein stability bounds for SGD and on sampling using PSD models have been accepted to NeurIPS 2023.

  • Jul 2023: Presented our work on stability bounds for heavy-tailed SGD at ICML 2023 in Hawaii, USA.

  • Jul 2023: Presented our work on sampling at COLT 2023 in Bangalore, India.

  • Jul 2023: Visited Indian Institute of Science, Bangalore, India.

  • Jun 2023: Our paper on the variational principle of mirror descent and mirror-Langevin has been accepted to IEEE Control Systems Letters and CDC 2023.

  • Jun 2023: New preprint on Wasserstein Stability Bounds for (Noisy) Stochastic Gradient Descent is available now. Check it out.

  • Mar 2023: New preprint on sampling from SDEs using PSD models is available now. Check it out.

  • Mar 2023: New preprint on Variational Principle of Mirror Descent and Mirror-Langevin is available now. Check it out.

  • Feb 2023: New preprint on stochastic gradient-based sampling is available now. Check it out.

  • Jan 2023: New preprint on algorithmic stability of heavy-tailed SGD for general loss functions is available now. Check it out.

  • Jan 2023: Our paper on explicit regularization via noise injection has been accepted to AISTATS 2023.

  • Dec 2022: Our paper on algorithmic stability of heavy-tailed SGD has been accepted to ALT 2023.

  • Aug 2022: Our paper on causal feature selection has been accepted to TMLR 2022.

  • Jul 2022: Presented our work on active learning at ICML 2022.

  • Jul 2022: Visited Google Research, India and the Indian Institute of Science, Bangalore.

  • Jun 2022: New preprint on explicit regularization via noise injection is available now. Check it out.

  • Jun 2022: New preprint on algorithmic stability of heavy-tailed continuous-time SGD is available now. Check it out.

  • May 2022: Our paper on active learning has been accepted to ICML 2022. Check out the preprint here.

  • Apr 2022: Presented our paper on faster rates for linear compositional optimization at AISTATS 2022. Check out the paper.

  • Mar 2022: I have started at the University of Illinois Urbana-Champaign.

  • Jan 2022: Gave a talk on our work on faster rates for linear compositional optimization at IST Austria (virtual).

  • Jan 2022: Our paper on faster rates for linear compositional optimization has been accepted to AISTATS 2022 for an oral presentation. Check out the paper.