How to Run PyTorch Models in the Browser With ONNX.js
Run PyTorch models in the browser with JavaScript by first converting your PyTorch model into the ONNX format and then loading that ONNX model into your website or app using ONNX.js. In this video, I take you through this process by building a handwritten digit recognizer that runs in the browser.
Live demo:
The demo sandbox code:
The GitHub repo:
I opened these issues for the bugs mentioned in the video if you want to track their status:
The benefits of running a model in the browser:
• Faster inference times with smaller models.
• Easy to host and scale (only static files).
• Offline support.
• User privacy (can keep the data on the device).
The benefits of using a backend server:
• Faster load times (don't have to download the model).
• Faster and more consistent inference times with larger models (can take advantage of GPUs or other accelerators).
• Model privacy (don't have to share your model if you want to keep it private).
Join our Discord community:
Connect with me:
🎵 Kazukii - Return