Beyond worst-case analysis: Perturbation stability

Often, we consider cases that don't matter. Here, Nina and I talk about perturbation stability – a (rather strict) definition that helps us consider the cases "that do matter".

0:00 - Intro
5:26 - Extremely smooth transition
5:28 - Clustering and perturbation stability
15:35 - Another extremely smooth transition
15:37 - Sketching min cut with LP
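
For anyone who wants the definition the video alludes to spelled out, here is a rough sketch of the usual alpha-perturbation-stability condition for clustering (my paraphrase of the formulation common in the beyond-worst-case literature; papers differ on details such as whether the perturbed distances must remain a metric, and it may not match the exact definition used in the talk):

    % An instance (X, d) with optimal clustering C*(X, d) is alpha-perturbation
    % stable (alpha >= 1) if every perturbation d' that stretches each distance
    % by at most a factor of alpha leaves the optimal clustering unchanged.
    \[
      \forall x, y \in X:\; d(x, y) \le d'(x, y) \le \alpha \, d(x, y)
      \;\Longrightarrow\; \mathcal{C}^*(X, d') = \mathcal{C}^*(X, d).
    \]
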
Comments

Hi Benson and Nina,

Big fan of both of your previous work! I really loved the contextualization and framing of this (especially referencing other Stanford classes; I appreciated the 221 shoutout as a CA of that class). Your slides were also very clear, and I really liked that you drew on them even though they were pre-made -- this was a nice middle ground. I also think your integration of previous papers into your presentation was very smooth -- something I struggled with mightily personally. Your algorithmic explanation was also excellent -- the walkthrough with an example was particularly illuminating. Really cool to see 261 concepts presented in such an accessible way -- don't tell Ashish, but I prefer your explanations to his in parts. Amazing work to you both! I would love to see your paper!

JonathanKo

Great work! I really liked the effort made to motivate the reasons behind these definitions. There was great emphasis on the idea that the instances we care about in practice satisfy this "perturbation stability" property, and therefore when thinking about real-world instances we don't need to be so concerned with worst-case hardness lower bounds. I am very curious to read more about the multi-way min-cut linear programming solution. Good job building interest in that part without having enough time to actually explain the setting -- the excitement you showed was enough to keep my interest. At times I struggled to keep up with all of the details being shared, particularly with the k-medians algorithm, but the frequent return to the big-picture takeaways helped to reground the talk even when some parts went over my head. Thank you! Forgot to mention: I particularly liked the perceptron example at the top -- it motivates the idea that ruling things out due to worst-case lower bounds can be premature.
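
(For anyone who has not seen the perceptron example being referenced: the algorithm itself is tiny, and its mistake bound depends on the margin of the data rather than on the worst case, which is exactly the "premature to rule out" point. Below is a minimal sketch of the standard textbook perceptron -- my illustration, not necessarily the exact example from the video.)

    import numpy as np

    def perceptron(X, y, max_epochs=100):
        """Standard perceptron: X is an (n, d) array of points, y holds +/-1 labels.

        The classical mistake bound is (R / gamma)^2, where R bounds the norms
        of the points and gamma is the separation margin -- so well-separated
        instances converge quickly even though contrived ones need not.
        """
        w = np.zeros(X.shape[1])
        for _ in range(max_epochs):
            mistakes = 0
            for xi, yi in zip(X, y):
                if yi * np.dot(w, xi) <= 0:  # misclassified (or on the boundary)
                    w += yi * xi             # standard perceptron update
                    mistakes += 1
            if mistakes == 0:                # every point classified correctly
                break
        return w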

JoeyBenjaminRivkin

Hi Nina and Benson!

Your presentation was excellent, with a strong outline and a clear emphasis on why we are adding this definition, which is often overlooked when diving into a new problem. The focus on the "why" was particularly effective, using the idea of a perceptron to motivate the problem. It might have been beneficial to also mention the usefulness of 'average-case' analysis. In discussing perturbation stability, the presentation could have been enhanced by including a formal definition (such as noise vs. measure) before building intuition, though this might be a personal preference.


Nina’s explanation of the clustering problem was great, especially in highlighting the importance of choosing a good ‘distance function’ to discover interesting structure in the data. The slides were very visually appealing and effectively presented key takeaways from papers without being too dense, which excellently motivated the concept of perturbation stability. However, more time could have been spent on the definition of perturbation stability, particularly explaining why a single clustering would remain optimal for all alpha-perturbed instances and why this clustering would be unique, as I found this detail rather unintuitive. Also, when discussing the algorithm, it might have been helpful to use a laser pointer, or separate slides for different ideas. Despite these minor suggestions, your explanation of the algorithm and its effectiveness was clear and engaging, with just the right amount of abstraction, and the example provided solidified my understanding. A small suggestion would be to give a sense of the current state-of-the-art in the problems presented and how this ties in with the broader theme, perhaps through concluding remarks to tie the entire presentation together.
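
(Side note for other viewers: the algorithms in this line of work are often strikingly simple. As an illustration of the flavor only -- not the presenters' k-medians procedure, and under a toy assumption that is much stronger than perturbation stability itself -- here is a naive single-linkage sketch: if every within-cluster distance of the target clustering is smaller than every between-cluster distance, greedily merging the two closest clusters until k remain recovers that clustering exactly, whereas on arbitrary instances the same greedy rule can fail badly.)

    import numpy as np

    def single_linkage(points, k):
        """Naive O(n^3) single-linkage agglomerative clustering down to k clusters.

        Toy guarantee: if some target k-clustering has every within-cluster
        distance smaller than every between-cluster distance, the merges below
        never cross target clusters, so the output is exactly that clustering.
        """
        points = np.asarray(points, dtype=float)
        clusters = [[i] for i in range(len(points))]

        def gap(ci, cj):
            # single-linkage distance: closest pair across the two clusters
            return min(np.linalg.norm(points[a] - points[b]) for a in ci for b in cj)

        while len(clusters) > k:
            # find and merge the two clusters with the smallest linkage distance
            _, i, j = min(
                (gap(clusters[i], clusters[j]), i, j)
                for i in range(len(clusters))
                for j in range(i + 1, len(clusters))
            )
            clusters[i] += clusters.pop(j)
        return clusters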

Lastly, Benson’s explanation of the multi-way min-cut problem was excellent, and he gave a compelling justification for using optimization-based approaches on “problems we care about”.
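
(For readers curious about the LP being referenced: the standard simplex-style relaxation of multi-way min cut embeds each vertex in the k-simplex, pins the terminals to its corners, and pays half the L1 distance across each edge. A sketch is below; if I recall the stability results correctly, the punchline is that on sufficiently stable instances this relaxation has an integral optimum, though I am not certain this is the exact formulation used in the talk.)

    % Relaxation of multi-way cut with terminals t_1, ..., t_k: each vertex u
    % gets a point x_u in the probability simplex, terminal t_i is pinned to
    % the unit vector e_i, and each edge pays half the L1 distance between
    % its endpoints' embeddings.
    \[
    \begin{aligned}
    \text{minimize}\quad & \tfrac{1}{2} \sum_{(u,v) \in E} w_{uv} \, \lVert x_u - x_v \rVert_1 \\
    \text{subject to}\quad & x_u \in \Delta_k = \{\, z \in \mathbb{R}^k_{\ge 0} : \textstyle\sum_i z_i = 1 \,\} && \forall u \in V, \\
    & x_{t_i} = e_i && \forall i \in \{1, \dots, k\}.
    \end{aligned}
    \]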


Overall, it was a great video with a compelling explanation and engaging delivery. The presentation flowed well, with intuitive definitions and a clear narrative. I enjoyed both Benson and Nina’s explanations of the algorithms and am excited to see further applications of perturbation stability!

MeenalGupta-llxw

Great talk! This video introduces a new concept for beyond-worst-case analysis, perturbation stability, which is natural and intuitive. The video then gives several applications of this notion to fundamental clustering and graph problems. The hand-made slides are cute and well-crafted, and the presentation is smooth. Nevertheless, the technical part is a bit hard to follow, and it would be helpful to animate the slides (although I admit that this is hard for hand-made slides). Also, it's not very clear how the LP algorithm for solving the multi-way min-cut problem, or its analysis, utilizes the assumption that the instance is perturbation stable (it would be great if the authors could explain further, as I'm super curious).

mingweiyang