Anant Raj

Marie-Curie Fellow
SIERRA Project Team (Inria),
Ecole Normale Supérieure, PSL Research University.
Coordinated Science Laboratory (CSL),
University of Illinois at Urbana-Champaign (UIUC).

Email: araj at illinois and anant dot raj at inria
Contact: Room 121, CSL, Urbana.

Short Bio

I am currently a Marie-Curie Fellow, jointly hosted by Prof. Francis Bach at the SIERRA Project Team (Inria) and Prof. Maxim Raginsky at the Coordinated Science Laboratory (CSL), UIUC. Before that, I completed my PhD in Machine Learning at the Max Planck Institute for Intelligent Systems, in the Empirical Inference Department, under the supervision of Prof. Bernhard Schoelkopf. I received my B.Tech-M.Tech dual degree in Electrical Engineering from IIT Kanpur, where my research was advised by Prof. Rajesh M Hegde, Prof. Vinay Namboodiri, Prof. Amitabha Mukerjee, and Prof. Tinne Tuytelaars (external Master's thesis advisor).

My research aims at understanding problems in machine learning theory and its applications. More specifically, I am interested in optimization theory, kernel methods, and the theoretical foundations of machine learning. I also have a keen interest in resource-efficient learning, such as active learning, coresets, and distributed inference. On the application side, I am interested in applying machine learning methods in the healthcare domain.

Recent News

  • Sep 2023: Our papers on Wasserstein stability bounds for SGD and on sampling using PSD models have been accepted to NeurIPS 2023.

  • Jul 2023: Presented our work on stability bounds for heavy-tailed SGD at ICML 2023 in Hawaii, USA.

  • Jul 2023: Presented our work on sampling at COLT 2023 in Bangalore, India.

  • Jul 2023: Visited the Indian Institute of Science, Bangalore, India.

  • Jun 2023: Our paper on a variational principle for mirror descent and mirror Langevin has been accepted to IEEE Control Systems Letters and CDC 2023.

  • Jun 2023: New preprint on Wasserstein stability bounds for (noisy) stochastic gradient descent is now available. Check it out.

  • Mar 2023: New preprint on sampling from SDEs using PSD models is out. Check it out.

  • Mar 2023: New preprint on a variational principle for mirror descent and mirror-Langevin is now available. Check it out.

  • Feb 2023: New preprint on stochastic gradient-based sampling is now available. Check it out.

  • Jan 2023: New preprint on the algorithmic stability of heavy-tailed SGD for general loss functions is now available. Check it out.