Create Streamlit data validation app with Deepchecks test suites and pandas-profiling reports
![preview_player](https://i.ytimg.com/vi/0mVZhEv9Xo0/maxresdefault.jpg)
In this hands-on lab, which can serve as a good baseline tutorial for Streamlit and Deepchecks users, we render Deepchecks test suites inside a Streamlit application. The objective of this tutorial is to help other Python developers build a full-fledged Python application that displays Deepchecks reports.
Deepchecks Open Source is a Python library for data scientists and ML engineers. The package includes extensive test suites for machine learning models and data, built in a way that's flexible, extendable, and editable. Deepchecks test suites are composed of checks; each check contains outputs to display in a notebook and/or conditions with a pass/fail output.
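The suite/check/condition structure described above can be sketched in plain Python. Note this is a conceptual illustration only, not the actual deepchecks API — the class names, result shape, and the tiny "null ratio" check are assumptions made for this sketch:

```python
from dataclasses import dataclass
from typing import Any, Callable


@dataclass
class Condition:
    """A condition inspects a check's computed value and yields pass/fail."""
    name: str
    predicate: Callable[[Any], bool]


@dataclass
class Check:
    """A check computes a value over the data and evaluates its conditions."""
    name: str
    compute: Callable[[list], Any]
    conditions: list

    def run(self, rows: list) -> dict:
        value = self.compute(rows)
        return {
            "check": self.name,
            "value": value,
            "conditions": [
                {"name": c.name, "passed": c.predicate(value)}
                for c in self.conditions
            ],
        }


def run_suite(checks: list, rows: list) -> list:
    """Run every check in the suite and collect the per-check results."""
    return [check.run(rows) for check in checks]


# Example: a tiny 'data integrity' style suite over rows of dicts.
rows = [{"age": 31}, {"age": None}, {"age": 45}, {"age": 29}]

null_ratio = Check(
    name="Null ratio of 'age'",
    compute=lambda rs: sum(r["age"] is None for r in rs) / len(rs),
    conditions=[Condition("null ratio below 10%", lambda v: v < 0.10)],
)

results = run_suite([null_ratio], rows)
# 1 of 4 rows is null -> ratio 0.25, so the condition fails.
print(results[0]["conditions"][0]["passed"])  # False
```

In the real library, the suite's `run` output renders as rich HTML and can also be serialized, which is what the application in this lab displays.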
Hands-on-lab content timeline:
----------------------------------------------
- (00:00) Hands-on Lab Starts
- (00:08) Content Intro
- (01:06) Complete Application Demo
- (03:33) Deepchecks and previous Tutorial
- (04:37) Pandas and streamlit-pandas profiling
- (05:39) Streamlit coding starts
- (10:14) Deepchecks sample script
- (11:10) Streamlit file selector/uploader
- (16:08) Profile and Validation choice menu
- (22:48) Pandas-profiling implemented
- (23:58) Deepchecks test integration
- (31:55) Deepchecks test validation added
- (37:44) Test results JSON filtering
- (48:07) Code and Functionality Recap
- (49:46) Code push to GitHub
- (50:46) Credits
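The JSON-filtering step in the timeline (37:44) boils down to walking the suite's serialized results and keeping only the checks whose conditions failed. A minimal stdlib-only sketch — the JSON shape here is a hypothetical simplification, and the real Deepchecks result layout may differ:

```python
import json

# Hypothetical suite-result JSON (shape assumed for this sketch).
raw = json.dumps([
    {"check": "Feature Drift", "conditions": [
        {"name": "drift score < 0.2", "passed": True}]},
    {"check": "Data Duplicates", "conditions": [
        {"name": "duplicate ratio <= 0%", "passed": False}]},
    {"check": "Mixed Nulls", "conditions": [
        {"name": "<= 1 null type", "passed": True}]},
])


def failed_checks(results_json: str) -> list:
    """Return names of checks with at least one failed condition."""
    results = json.loads(results_json)
    return [
        r["check"]
        for r in results
        if any(not c["passed"] for c in r["conditions"])
    ]


print(failed_checks(raw))  # ['Data Duplicates']
```

Filtering on the JSON rather than the rendered HTML lets the Streamlit app show a compact pass/fail summary before the full report.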
Deepchecks GitHub:
Source Code in this hands-on Lab:
Please visit:
------------------
Tags:
#ai #aicloud #h2oai #driverlessai #machinelearning #cloud #mlops #model #collaboration #deeplearning #modelserving #modeldeployment #keras #tensorflow #pytorch #datarobot #datahub #aiplatform #aicloud #modelperformance #modelfit #modeleffect #modelimpact #bias #modelbias #modeldeployment #modelregistery #modelpipeline #neptuneai #librosa #pythondsp #pythonaudio #spotify #spotipy #streamlit #pythonapps #deepchecks #modeltesting