Deploy Transformer Models in the Browser with #ONNXRuntime

In this video we will demo how to use #ONNXRuntime Web with a distilled BERT model to run inference on-device in the browser with #JavaScript. This demo is based on the amazing work of our community member Jo Kristian Bergum!

#machinelearning #transformers #pytorch #onnx #onnxruntime #JavaScript #web
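For reference, here is a minimal sketch of what in-browser inference with ONNX Runtime Web can look like. The model file name, token ids, and input names are illustrative assumptions (the ids would come from a matching tokenizer, which is not shown), so treat this as a sketch rather than the exact code from the video.

// Load the ONNX Runtime Web bundle first, e.g. from the CDN:
// <script src="https://cdn.jsdelivr.net/npm/onnxruntime-web/dist/ort.min.js"></script>
async function run() {
  // Assumption: the exported DistilBERT-style model is served next to the page as model.onnx.
  const session = await ort.InferenceSession.create('./model.onnx');

  // Hypothetical token ids from a matching tokenizer (tokenization not shown here).
  const ids = [101, 2023, 2003, 1037, 7099, 102];

  // int64 inputs must be backed by a BigInt64Array in the browser.
  const inputIds = new ort.Tensor('int64', BigInt64Array.from(ids.map(BigInt)), [1, ids.length]);
  const attentionMask = new ort.Tensor('int64', BigInt64Array.from(ids.map(() => 1n)), [1, ids.length]);

  // Input names must match the export; input_ids and attention_mask are
  // the usual names for a Hugging Face DistilBERT export.
  const outputs = await session.run({ input_ids: inputIds, attention_mask: attentionMask });
  console.log(outputs);
}

run();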
Comments

This instructor is actually an angel, thank you Madam for the straightforward tutorial!

seikatsu_ki

Nice. How do you convert it to ONNX using CUDA?

Gerald-izmv

There is a bug:
InvalidArgument: [ONNXRuntimeError] : 2 : INVALID_ARGUMENT : Unexpected input data type. Actual: (tensor(int32)), expected: (tensor(int64))

amanbishnoi
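That error usually means the input tensors were built from an int32 typed array while the exported model expects int64 token ids. A hedged sketch of the usual fix in ONNX Runtime Web, with hypothetical token ids:

// Back int64 inputs with a BigInt64Array so the dtype matches what the model expects.
const ids = [101, 2023, 102]; // hypothetical token ids
const inputIds = new ort.Tensor('int64', BigInt64Array.from(ids.map(BigInt)), [1, ids.length]);

Alternatively, the model can be re-exported with int32 inputs, in which case a plain Int32Array works.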

This is not "in the browser". This is still Node.js. That's server technology.

KrisMerckx