Xiangyu Chang - Provable Benefits of Overparameterization in Model Compression
Contributed talk at the Workshop on the Theory of Overparameterized Machine Learning (TOPML) 2021.
Speaker: Xiangyu Chang (UC Riverside)
Talk title: Provable Benefits of Overparameterization in Model Compression: From Double Descent to Pruning Neural Networks