How to convert almost any PyTorch model to ONNX and serve it using Flask

In this video, I show you how you can convert any #PyTorch model to the #ONNX format and serve it using a Flask API.
I will be converting the #BERT sentiment model that we built in previous videos.
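
A minimal sketch of the export step, assuming a Hugging Face-style BERT classifier (the model name, input names, and shapes here are illustrative, not the exact code from the video):

    import torch
    from transformers import BertForSequenceClassification

    # Illustrative stand-in for the fine-tuned BERT sentiment model; in
    # practice you would load your own fine-tuned weights. torchscript=True
    # makes the model return plain tuples, which the tracer behind
    # torch.onnx.export can handle.
    model = BertForSequenceClassification.from_pretrained(
        "bert-base-uncased", num_labels=2, torchscript=True
    )
    model.eval()

    # Dummy inputs only pin down rank and dtype; dynamic_axes below lets
    # batch size and sequence length vary at inference time.
    dummy = torch.ones(1, 128, dtype=torch.long)

    torch.onnx.export(
        model,
        (dummy, dummy, dummy),  # input_ids, attention_mask, token_type_ids
        "model.onnx",
        input_names=["input_ids", "attention_mask", "token_type_ids"],
        output_names=["logits"],
        dynamic_axes={
            "input_ids": {0: "batch", 1: "seq"},
            "attention_mask": {0: "batch", 1: "seq"},
            "token_type_ids": {0: "batch", 1: "seq"},
            "logits": {0: "batch"},
        },
        opset_version=11,
    )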

Please subscribe and like the video to help keep me motivated to make awesome videos like this one. :)

Follow me on:
Comments

Thank you for all the videos and all the PyTorch knowledge :)

oostopitre

Nice video, it helped a lot. I follow you everywhere. Thanks for this, Abhishek bhai.

vikasbharadwaj

The video is really helpful, but I already have the fine-tuned model (PEGASUS) on Colab, so will it be any different in this scenario?
Note: the model is fine-tuned using Hugging Face.

karimfayed

Hello Thakur, thanks for your awesome video, but I have a problem when converting my model. It's a BERT-like model, and when I attempt to convert it, this error is raised: "RuntimeError: Unsupported: ONNX export of Slice with dynamic inputs. DynamicSlice is a deprecated experimental op." Could you tell me what that means and what I should do to handle it? Thank you a lot!

houdoffe
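
A note on the Slice error above: it usually means the model slices a tensor with runtime-dependent indices while exporting to an old opset. ONNX opset 10 added a Slice op with dynamic inputs, so passing opset_version=10 or higher to torch.onnx.export is the usual fix. A self-contained toy reproduction (the module here is hypothetical, standing in for the BERT-like model):

    import torch

    class DynamicSliceModel(torch.nn.Module):
        # Toy stand-in: slices with a value computed at runtime, e.g.
        # trimming padding to the longest sequence in the batch.
        def forward(self, x, lengths):
            return x[:, : lengths.max()]

    x = torch.randn(2, 128, 768)
    lengths = torch.tensor([50, 90])

    # With opset 9 this kind of slice raises the DynamicSlice error;
    # opset >= 10 exports it as a Slice op with dynamic start/end inputs.
    torch.onnx.export(DynamicSliceModel(), (x, lengths), "slice.onnx",
                      opset_version=11)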

If my PyTorch model outputs a dictionary with keys 'classification' and 'regression', can I export it to .onnx? Thanks!

lettuan
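
On the dictionary-output question above: torch.onnx.export works with tensors and tuples of tensors, so the usual trick is a thin wrapper that unpacks the dict into a fixed-order tuple. A minimal sketch (both modules here are hypothetical):

    import torch

    class DictModel(torch.nn.Module):
        # Toy stand-in for a model that returns a dict.
        def forward(self, x):
            return {"classification": x.sum(dim=1), "regression": x.mean(dim=1)}

    class TupleWrapper(torch.nn.Module):
        # Unpack the dict into a tuple with a fixed order before export.
        def __init__(self, model):
            super().__init__()
            self.model = model

        def forward(self, x):
            out = self.model(x)
            return out["classification"], out["regression"]

    x = torch.randn(1, 10)
    torch.onnx.export(
        TupleWrapper(DictModel()), (x,), "model.onnx",
        input_names=["x"], output_names=["classification", "regression"],
    )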

Hello Abhishek, good stuff. Can you please take on a Kaggle competition using TensorFlow, end-to-end, like the one you did for PyTorch (Bengali Handwritten Dataset)? That would be super helpful. BTW, excellent stuff.

nikhilkumar

Thanks for another good, informative video. Which PyTorch extension do you use? It seems very good.

arunmohan

I have been trying to convert a Longformer question-answering model (which I trained) to ONNX. Unfortunately, I am getting the following error: "RuntimeError: Only consecutive 1-d tensor indices are supported in exporting aten::index_put to ONNX." I haven't been able to understand it or get around it; can someone shed some light on how this can be fixed?

syedhasany
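
On the index_put error above: it typically comes from in-place assignment through a boolean or fancy index somewhere in the model (Longformer's attention code does this kind of thing). One common workaround, sketched below with hypothetical toy modules, is to rewrite the assignment out-of-place with torch.where, which exports cleanly:

    import torch

    class InPlace(torch.nn.Module):
        def forward(self, x, mask):
            x[mask] = 0.0  # lowered to aten::index_put during export
            return x

    class OutOfPlace(torch.nn.Module):
        def forward(self, x, mask):
            # Same result, expressed with ops the ONNX exporter supports.
            return torch.where(mask, torch.zeros_like(x), x)

    x = torch.randn(2, 8)
    mask = x < 0
    torch.onnx.export(OutOfPlace(), (x, mask), "model.onnx", opset_version=11)

Whether this applies depends on where in the Longformer code the error originates, so treat it as a direction to look in rather than a guaranteed fix.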

Thanks for yet another applied ML video. Wondering if we can get batch predictions from a model served on Flask?

rohitupadhyay
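
On batch predictions: yes, as long as the model was exported with a dynamic batch axis, the Flask endpoint can tokenize a list of texts with padding and run them through onnxruntime in one call. A sketch, assuming the export above (endpoint name, JSON schema, and tokenizer are illustrative):

    import numpy as np
    import onnxruntime as ort
    from flask import Flask, jsonify, request
    from transformers import BertTokenizer

    app = Flask(__name__)
    session = ort.InferenceSession("model.onnx")
    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")

    @app.route("/predict", methods=["POST"])
    def predict():
        # Expect {"sentences": ["text one", "text two", ...]}.
        sentences = request.json["sentences"]
        enc = tokenizer(sentences, padding=True, truncation=True,
                        max_length=128, return_tensors="np")
        logits = session.run(None, {
            "input_ids": enc["input_ids"].astype(np.int64),
            "attention_mask": enc["attention_mask"].astype(np.int64),
            "token_type_ids": enc["token_type_ids"].astype(np.int64),
        })[0]
        # Softmax over the two sentiment classes, one row per sentence.
        exp = np.exp(logits - logits.max(axis=1, keepdims=True))
        probs = exp / exp.sum(axis=1, keepdims=True)
        return jsonify({"positive": probs[:, 1].tolist()})

    if __name__ == "__main__":
        app.run()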

After this video I thought of giving it a try, but it didn't work and we struggled a lot. We are working on a text generation problem.

Could anyone help?

sachinkalsi

So... with ONNX, we don't need to use jit.trace for the logic in PyTorch?

jonathansum
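
On the jit.trace question: torch.onnx.export traces the model internally by default, so you do not call jit.trace yourself. But tracing still cannot capture data-dependent control flow; for that you can script the model first and export the ScriptModule (recent PyTorch versions accept it directly). A minimal sketch with a hypothetical module:

    import torch

    class Branchy(torch.nn.Module):
        # Data-dependent control flow: a plain trace would bake in
        # whichever branch the dummy input happened to take.
        def forward(self, x):
            if x.sum() > 0:
                return x * 2
            return x - 1

    # Scripting preserves the if/else, which the exporter turns into an
    # ONNX If node.
    scripted = torch.jit.script(Branchy())
    x = torch.randn(3)
    torch.onnx.export(scripted, (x,), "branchy.onnx", opset_version=11)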

Good! How do I convert the PT model to a TensorRT engine?

dptian
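
On the TensorRT question: the usual route is PyTorch -> ONNX (as in this video) -> TensorRT engine. The quickest path is the trtexec tool that ships with TensorRT (trtexec --onnx=model.onnx --saveEngine=model.engine). A rough Python sketch of the same build, assuming the TensorRT 8.x Python API (details vary between versions):

    import tensorrt as trt

    logger = trt.Logger(trt.Logger.WARNING)
    builder = trt.Builder(logger)
    network = builder.create_network(
        1 << int(trt.NetworkDefinitionCreationFlag.EXPLICIT_BATCH)
    )

    # Parse the ONNX file produced by torch.onnx.export.
    parser = trt.OnnxParser(network, logger)
    with open("model.onnx", "rb") as f:
        if not parser.parse(f.read()):
            raise RuntimeError(parser.get_error(0))

    config = builder.create_builder_config()
    # 1 GiB workspace; this setter is the TensorRT 8.4+ spelling.
    config.set_memory_pool_limit(trt.MemoryPoolType.WORKSPACE, 1 << 30)

    engine = builder.build_serialized_network(network, config)
    with open("model.engine", "wb") as f:
        f.write(engine)

Note that if the ONNX model was exported with dynamic axes, TensorRT also needs an optimization profile that pins min/opt/max shapes for each input.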

If the runtime for PyTorch on CPU and ONNX is almost the same, what is the point of converting?

ZhiboXiao
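
On the "what is the point" question: even when plain fp32 CPU latency is similar, ONNX gives you one artifact that runs on many runtimes (onnxruntime, TensorRT, mobile) without shipping PyTorch, plus cheap post-export optimizations. For example, onnxruntime's dynamic quantization often speeds up transformer inference on CPU; a sketch, assuming the exported model.onnx from above:

    from onnxruntime.quantization import QuantType, quantize_dynamic

    # Rewrite fp32 weights as int8; activations are quantized on the fly.
    quantize_dynamic("model.onnx", "model-int8.onnx",
                     weight_type=QuantType.QInt8)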

Brother, all that's left for me to do is like the videos. As for understanding them, that's not happening ;-/

amandarash