Talks and Teaching

Talks given and courses taught.

Talks

Major Talks

  1. A Framework for Partially-Observed Reward States in RLHF, UT Austin, 2024.
  2. Learning Mixtures of Markov Chains and MDPs, ICML Oral Talk, 2023.
  3. Holomorphic Differentials, Measured Foliations and the Hubbard-Masur Theorem. Undergraduate thesis defence, 2021. Slides.
  4. An Algorithm for the Generalized Seifert Matrix. Regensburg internal seminar, 2020. Slides. The algorithm has been simplified since.

Minor Machine Learning Talks

  1. A Tour of Practical RLHF Methods, CASI Reading Group, 2024.
  2. End-to-End Retrieval Augmented Generation, EECS 598, 2024.
  3. Bayesian Design Principles for Frequentist Sequential Decision-Making, CASI Reading Group, 2023.
  4. Minimax Optimal Offline Evaluation for Multi-Armed Bandits, CASI Reading Group, 2023.
  5. Learning Mixtures of Linear Dynamical Systems, CASI Reading Group, 2022.

Teaching

University of Michigan

I was the sole instructor for sections of 15-20 students.

  1. Math 116 (Calculus II) - Winter 2022, Fall 2022.
  2. Math 115 (Calculus I) - Fall 2021.

Monsoon Math Camp

Cut, Fold, Paste: Homology and the Classification of Surfaces, 2021

Description: How many truly different shapes can you get by pasting polygons at their boundaries? Mathematicians often like to “classify objects” - many big research endeavours in mathematics are geared towards classification. What does it mean to classify a mathematical object?
This will be illustrated using the “classification of surfaces,” which is related to our first question. We will try to understand how a surface might be defined from our intuitive idea of it, reduce it to a combinatorial object, and then classify these combinatorial objects using tools we develop along the way. We will see powerful tools like the Euler characteristic and homology.

Prerequisites: Comfort with the ideas of sets, functions and induction. Visual intuition and familiarity with the notion of a graph will be very helpful.
Verdict: With lower prerequisites and a gentler pace than my 2020 course, this one was far more successful. I avoided the rookie mistake of pitching the material too far above the students' level; they followed well enough to point out minor (and some not-so-minor) errors. I also had to work out the details of a combinatorial version of a smooth argument made by Mike Miller in a blog post, which was fun for me.
Geometry, Symmetry and Hyperbolic Space, 2020

Description: Exposure to a lot of Euclidean geometry may create the impression that higher geometry is the study of generalized distance spaces. This course will try to convince participants that, in some cases, a better view of geometry is the interaction between a space and its group of transformations, via material on elementary hyperbolic geometry. We will see basic results in hyperbolic geometry, the hyperbolic Gauss-Bonnet Theorem, the Iwasawa decomposition, a quick treatment of Fuchsian groups and quotienting, and, if time permits, the Milnor-Svarc lemma.

Prerequisites: High school calculus and matrices.
Verdict: The course was quite challenging, and only about half the class followed it to the end.