Kristian Georgiev

S.B. MIT '21

I am a first-year PhD student in the TOC group at MIT, advised by Samuel Hopkins and Aleksander Mądry. My academic interests lie at the intersection of machine learning, theoretical computer science, and optimization.

Previously, I obtained a joint degree in Mathematics and Computer Science from MIT, supported in part by the Marie G. Dennett Foundation. During that time, I was fortunate to be advised by Prof. Ozdaglar as a SuperUROP scholar working on equivariance in deep learning, and by Prof. Mądry, working on adversarial robustness. I have interned at QuantCo and Apple.

I grew up in Bulgaria. In my free time, I enjoy playing soccer, volleyball, and chess. At MIT, I have led the AI@MIT club.


Projects

Below is a mix of class projects and stuff just for fun.
Read full paper | View slides

In this work we analyze the behavior of the last iterate of the Extragradient (EG) and Proximal Point (PP) algorithms on smooth variationally coherent problems (Zhou et al., 2017). In particular, we examine a relaxation of the conditions in Golowich et al. (2020), who show a rate of $$\mathcal{O}\left( 1 / \sqrt{T} \right)$$ for both algorithms when the problem is convex-concave. We show that best-iterate convergence at a rate of $$\mathcal{O}\left( 1 / \sqrt{T} \right)$$ carries over naturally to variationally coherent problems, but key monotonicity properties are lost.
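For context, the two algorithms (stated here in their standard form for an operator $$F$$ arising from the saddle problem $$\min_x \max_y f(x,y)$$ with step size $$\eta$$, not in the relaxed setting above) update as follows:

```latex
% Extragradient: explicit extrapolation step, then update
z_{t+1/2} = z_t - \eta F(z_t), \qquad
z_{t+1}   = z_t - \eta F(z_{t+1/2})
% Proximal Point: implicit update
z_{t+1} = z_t - \eta F(z_{t+1})
```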

Golowich, N., Pattathil, S., Daskalakis, C., and Ozdaglar, A. Last iterate is slower than averaged iterate in smooth convex-concave saddle point problems, 2020.

Zhou, Z., Mertikopoulos, P., Bambos, N., Boyd, S., and Glynn, P. W. Stochastic mirror descent in variationally coherent optimization problems, NeurIPS 2017.

Read full paper | Visit repository

Gradient-based meta-learning has established itself as a promising research direction for problems in few-shot learning: data-constrained training on an ensemble of tasks with quick adaptation to a task unseen during training. Until recently, little has been known about the success of model-agnostic meta-learners (MAMLs), which are particularly useful for few-shot learning. We shed light on the phenomenon of feature reuse in MAMLs through empirical visualizations of the loss landscapes of a variety of tasks. We develop meta-visualization: an efficient framework for visualizing any gradient-based meta-learning algorithm, which can be used to generate hypotheses and guide model design. The contributions of our work range from augmenting research in the field of meta-learning to explaining the success of the method through a geometric lens.
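The meta-visualization framework itself is described in the paper; the generic trick behind most loss-landscape plots, evaluating the loss on a random 2-D slice of parameter space, can be sketched as follows (function names and grid choices are illustrative, not the paper's API):

```python
import random

def loss_surface_slice(loss, params, steps=5, radius=1.0, rng=random):
    """Evaluate `loss` on a 2-D slice of parameter space around `params`.
    Two random directions span the slice; each grid point perturbs the
    parameters along them.  (A generic sketch, not the paper's framework.)"""
    d1 = [rng.gauss(0, 1) for _ in params]
    d2 = [rng.gauss(0, 1) for _ in params]
    ticks = [radius * (2 * i / (steps - 1) - 1) for i in range(steps)]
    grid = []
    for a in ticks:
        row = []
        for b in ticks:
            shifted = [p + a * u + b * v for p, u, v in zip(params, d1, d2)]
            row.append(loss(shifted))
        grid.append(row)
    return grid
```

Plotting the resulting grid as a contour or surface plot gives the familiar landscape pictures.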

Read full paper | View slides

We explore the hypothesis that robust training helps to disentangle representations by looking at the behavior of individual units of a robustly trained neural network through network dissection. We observe that to a large extent, robust models explain each high-level concept with fewer units than the equivalent non-robust models.

The minimum $$k$$-cut problem is a natural generalization of the famous minimum cut problem, where we seek a cut that partitions a graph $$G(V,E)$$ into $$k$$ components. Karger and Stein (1996) develop a randomized algorithm and provide an analysis showing it finds a solution with high probability in $$\tilde{\mathcal{O}}\left( |V|^{2(k-1)} \right)$$ time. A reduction to the maximum-weight clique problem gives a lower bound of $$\Omega\left( |V|^{(1-o(1))k} \right)$$. Until recently, the gap in the exponent was not explained. Confirming previous experimental results, Anupam Gupta et al. (2020) provide a refined analysis of the Karger-Stein algorithm showing that it is tight up to lower-order terms. We survey the techniques employed in their analysis, together with other recent analyses of algorithms of a similar flavor; this includes ideas from extremal set theory and stochastic processes.
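As a rough illustration of the underlying idea (plain randomized contraction repeated over many trials, not the recursive Karger-Stein variant analyzed above; the edge-list representation and trial count are placeholder choices):

```python
import random

def contract_to_k(edges, n, k):
    """One trial of randomized contraction: contract edges in a uniformly
    random order until only k super-vertices remain, then return the edges
    crossing between super-vertices (a candidate k-cut)."""
    parent = list(range(n))

    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v

    components = n
    pool = edges[:]
    random.shuffle(pool)
    for u, v in pool:
        if components == k:
            break
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            components -= 1
    return [(u, v) for u, v in edges if find(u) != find(v)]

def min_k_cut(edges, n, k, trials=200):
    """Repeat the contraction enough times that the minimum k-cut survives
    with high probability, and keep the smallest candidate found."""
    best = None
    for _ in range(trials):
        cut = contract_to_k(edges, n, k)
        if best is None or len(cut) < len(best):
            best = cut
    return best
```

The refined analysis cited above concerns exactly how many trials (or recursive branchings) are needed for the minimum cut to survive.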

Anupam Gupta et al. “Optimal Bounds for the k-cut Problem”. In: arXiv preprint arXiv:2005.08301 (2020).

David R. Karger and Clifford Stein. “A New Approach to the Minimum Cut Problem”. In: J. ACM 43.4 (July 1996), pp. 601–640.

Wikipedia. “Karger's Algorithm”. https://en.wikipedia.org/wiki/Karger%27s_algorithm

Continuous-time Markov processes are heavily used to approximate solutions to parabolic and elliptic partial differential equations. In this paper, we examine the role of the Feynman-Kac formula in this approximation. We present the classical results for linear parabolic equations, then generalize the method to stochastic processes with discrete jump discontinuities. Via the Fokker-Planck equation, we show that this class of SDEs corresponds, under the Feynman-Kac transformation, to a more general class of partial integro-differential equations. We utilize a jump-adapted Euler-Maruyama scheme to approximate the evolution of jump-diffusions. Finally, we analyze the weak and strong convergence properties of the implemented methods.
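A minimal fixed-step sketch of Euler-Maruyama for a jump-diffusion with compound-Poisson-style jumps (a simplification of the jump-adapted scheme used in the paper, which instead places grid points at the jump times; all coefficient functions here are placeholders):

```python
import math
import random

def euler_maruyama_jump(x0, drift, diffusion, jump_size, lam, T, n_steps,
                        rng=random):
    """Fixed-step Euler-Maruyama for dX = drift dt + diffusion dW + jump dN,
    where N is a Poisson process with intensity lam.  For small dt the
    Poisson(lam * dt) jump count is approximated by a Bernoulli trial."""
    dt = T / n_steps
    x = x0
    for _ in range(n_steps):
        dW = rng.gauss(0.0, math.sqrt(dt))          # Brownian increment
        jump = jump_size(x) if rng.random() < lam * dt else 0.0
        x = x + drift(x) * dt + diffusion(x) * dW + jump
    return x
```

With the diffusion and jump terms set to zero this reduces to the ordinary forward Euler method, which is a quick sanity check for the scheme.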

(In Proceedings of the Forty-sixth Spring Conference of the Union of Bulgarian Mathematicians 2017.)
The paper deals with finding unions of lines in $$\mathbb{F}^n_{p^{2k}}$$ which obey the Wolff axiom and have minimal size. We provide an extension of the constructions for $$\mathbb{F}^3_{p^{2k}}$$, obtained by Tao in 2002, to a construction for $$\mathbb{F}^n_{p^{2k}}$$. We determine the size of the union of lines in our construction in $$\mathbb{F}^n_{p^{2k}}$$ to be $$\mathcal{O}\left(p^{1.6kn}\right)$$. We prove that our construction obeys the Wolff axiom up to a heuristically negligible number of lines.

The project belongs to the field of topology; more specifically, it analyzes the identification (gluing) of the edges of $$2n$$-gons. We prove that the resulting manifold is a topological surface, and present a classification of compact surfaces. The author's contributions consist of constructing a conjecture regarding the number of ways to obtain an arbitrary manifold from a $$2n$$-gon, as well as computing the number of edge configurations of a $$2n$$-gon that form a sphere, a torus, $$\mathbb{R}P^2$$, and a Klein bottle, thus verifying the conjecture for these manifolds.

I started developing this project in the summer of 2014 at HSSIMI, a summer research school held in Blagoevgrad, Bulgaria, where I worked under the guidance of Katerina Velcheva from Stanford. Afterwards, I continued working on the topic under the guidance of Prof. Kopanov from Plovdiv University. The project was presented at the 27th edition of EUCYS, the largest science fair for high school students in Europe.

Go to Chrome store | Visit repository

Lockdown motivated me to make an app that makes workouts more fun. HIITify is an app that lets you customize your favorite Spotify songs to match your HIIT workout, with custom BPMs for workout and rest intervals.

Ever wanted to listen to your favorite Spotify mix during HIIT (High-Intensity Interval Training) to have the perfect workout? HIITify is a Spotify extension that allows you to:
• set your desired workout tempo (BPM) and have it maintained across all of your songs during the workout
• set the length of each workout and rest interval
• hear beep cues at the start and end of each interval
• manually set the song speed to any value at any time.
Go to web app | Visit repository

I play an unhealthy amount of foosball. And so do a lot of my friends. Soon the inevitable question arises: who is the best? To answer that, Maddy and I created FoosRank, a ranking system for foosball games. The algorithm is a slight modification of the Elo system used in chess, among other games. FoosRank also keeps score during the game and shows each user their game history and ranking over time.
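FoosRank's exact tweak lives in the repository; the vanilla Elo update it modifies looks roughly like this (the K-factor and 400-point scale are the usual chess defaults, not necessarily FoosRank's):

```python
def expected_score(r_a, r_b, scale=400):
    """Probability that player A beats player B under the Elo model."""
    return 1.0 / (1.0 + 10 ** ((r_b - r_a) / scale))

def elo_update(r_a, r_b, score_a, k=32):
    """Update both ratings after a game.  score_a is 1 for an A win,
    0 for a loss, and 0.5 for a draw; rating changes are zero-sum."""
    delta = k * (score_a - expected_score(r_a, r_b))
    return r_a + delta, r_b - delta
```

Beating a much higher-rated player moves both ratings a lot; beating a much lower-rated one barely moves them.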

I'm very happy that the app caught on and 20+ people have recorded 500+ games on the platform since we made it!

A smart form-parser to help visually impaired people fill out forms.

No one likes filling out forms, but routine tasks like filling out taxes or visa applications are especially hard for the visually impaired. As a result, we came up with the idea of a smart scanner application that would save the time and energy of people who wouldn’t be able to read the forms on their own.

We created an Android scanner application that takes a photo of the form and crops the image so that the four edges of the paper match the four edges of the camera frame, accommodating people who can’t see well enough to line up the paper perfectly. An algorithm we developed then detects the location of each text box that needs to be filled in. Next, the app verbally prompts the user as it goes through the text boxes one by one and saves their answers; the user can also use the UI buttons to go to the previous text box, go to the next text box, or print out the form. Eventually it returns a PDF of the form with all the text boxes filled in with the user's input, ready to be emailed or printed. We chose to build a smartphone application because the convenience of having your phone read text and fill out forms for you seemed very appealing.

In my freshman year, I tried automating some mundane tasks like playing music on different platforms, maintaining alarms, calendars, and to-do lists, and controlling the lights and monitors. I found an oldish PC that MIT was about to throw out and turned it into a media server that constantly listens for voice commands. Having coded for only about a month by then, I was certain it would be easy to make everything from scratch. Soon, my bravery stemming from Dunning-Kruger's Mount Stupid was not enough to complete the task, and I resorted to using the pocketsphinx library for the voice-recognition part.

Teaching

I have served as a course assistant for the following classes.

• Introduction to Deep Learning (6.S191), teaching assistant, IAP (Winter) 2020 and IAP (Winter) 2021
• Introduction to Inference (6.008), lab assistant, Fall 2019
• Introduction to Machine Learning (6.036), lab assistant, Spring 2019
• Abstract Algebra I (18.701), grader, Fall 2018

Coursework

These are some of my favorite classes.

• Machine Learning (6.867), graduate CS, Fall 2019
• Optimization for Machine Learning (6.881), graduate CS, Spring 2020
• Topics in Deployable Machine Learning (6.S979), graduate CS, Fall 2019
• Fundamentals of Probability (6.436), graduate math, Fall 2018
• Mathematical Statistics (18.655), graduate math, Spring 2019