Advanced Hyperparameter Optimization for Deep Learning with MLflow - Maneesh Bhide Databricks

Building on the "Best Practices for Hyperparameter Tuning with MLflow" talk, we will present advanced topics in HPO for deep learning, including early stopping, multi-metric optimization, and robust optimization. We will then discuss implementations using open source tools. Finally, we will discuss how we can leverage MLflow with these tools and techniques to analyze the performance of our models.
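One of the advanced topics named above, early stopping (pruning) of unpromising trials, can be sketched in plain Python. This is a hypothetical illustration of a median-rule pruner, not the talk's actual implementation; real versions live in tools such as Hyperopt, Optuna, or Ray Tune, and all names below are invented for the example.

```python
# Sketch of median-rule early stopping for hyperparameter trials
# (illustrative only; names and the pruning rule are assumptions).
from statistics import median

def run_trials(curves, warmup=2):
    """Step through trials in lockstep; prune any trial whose
    intermediate score falls below the median of the scores the
    still-active trials reported at the same step.
    `curves` maps trial name -> list of per-step validation scores
    (higher is better). Returns (best trial name, pruned set)."""
    history = {name: [] for name in curves}  # scores reported so far
    pruned = set()
    n_steps = max(len(c) for c in curves.values())
    for step in range(n_steps):
        # Collect this step's reports from still-active trials.
        reports = {}
        for name, curve in curves.items():
            if name in pruned or step >= len(curve):
                continue
            reports[name] = curve[step]
            history[name].append(curve[step])
        # Give every trial a few warm-up steps before pruning.
        if step < warmup or len(reports) < 2:
            continue
        cut = median(reports.values())
        for name, score in reports.items():
            if score < cut:
                pruned.add(name)
    best = max((n for n in curves if n not in pruned),
               key=lambda n: history[n][-1])
    return best, pruned
```

In practice each intermediate score would also be logged to MLflow (one run per trial), so pruned and surviving trials can be compared side by side in the tracking UI.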

About: Databricks provides a unified data analytics platform, powered by Apache Spark™, that accelerates innovation by unifying data science, engineering and business.

Comments

I like the idea of including test (inference) time: it is a surrogate for model complexity, and lower complexity means more regularisation, which is why you can expect better holdout-set accuracy.

LukaszWiklendt
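The idea in the comment above, folding inference time into the search objective so faster (simpler) models are preferred, is one common form of multi-metric optimization: scalarizing several metrics into one score. A minimal sketch, where the penalty weight is an arbitrary assumption rather than a recommendation:

```python
# Hypothetical scalarized objective: validation accuracy penalised
# by inference time (the 0.1 weight is an assumption for illustration).
def combined_score(accuracy, infer_seconds, time_weight=0.1):
    """Higher is better."""
    return accuracy - time_weight * infer_seconds

# Two invented candidate configurations from a tuning sweep.
candidates = [
    {"name": "big",   "accuracy": 0.91, "infer_seconds": 2.0},
    {"name": "small", "accuracy": 0.89, "infer_seconds": 0.2},
]
best = max(candidates,
           key=lambda c: combined_score(c["accuracy"], c["infer_seconds"]))
# With this weight the slightly less accurate but much faster
# "small" model wins: 0.89 - 0.02 = 0.87 vs 0.91 - 0.20 = 0.71.
```

Logging accuracy, inference time, and the combined score as separate MLflow metrics keeps the trade-off inspectable after the sweep finishes.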