Xiangyu Chang - Provable Benefits of Overparameterization in Model Compression

Contributed talk at the Workshop on the Theory of Overparameterized Machine Learning (TOPML) 2021.
Speaker: Xiangyu Chang (UC Riverside)
Talk title: Provable Benefits of Overparameterization in Model Compression: From Double Descent to Pruning Neural Networks