A1111: NVIDIA TensorRT Extension for Stable Diffusion (Tutorial)

The NVIDIA TensorRT extension is an official extension for the Stable Diffusion WebUI that uses the Tensor Cores in your NVIDIA GPU to speed up image generation. In my testing on an RTX 4090, I saw a 20% to 50% performance improvement with TensorRT. It does have its limitations and bugs, however. In this tutorial, I cover everything from how it works, to installing it, fixing launch errors, and the exact settings for training a TensorRT engine profile.

------------------------

Relevant Links:

Checkpoints

Python and Git:

------------------------

Dev Branch Commands:
git branch -a
git checkout dev
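If `dev` does not show up as a local branch right away, fetching first helps. A minimal sketch of the full switch, assuming you run it from the root of your stable-diffusion-webui checkout (the path is yours):

```shell
# Run from the root of the stable-diffusion-webui checkout.
git fetch origin            # refresh the remote branch list
git branch -a               # 'remotes/origin/dev' should appear here
git checkout dev            # create/switch to the local dev branch
git pull origin dev         # make sure it is up to date
git branch --show-current   # prints the active branch; expect 'dev'
```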

Error Fixing Commands:
pip uninstall tensorrt
pip cache purge
pip uninstall -y nvidia-cudnn-cu11
pip install onnxruntime
pip install colored
pip install onnx
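These fixes only work if they run against the WebUI's own Python environment, not your system Python. A sketch of the whole sequence run inside the venv; the venv location is an assumption about a default install:

```shell
# Run from the stable-diffusion-webui folder, inside its own venv
# (venv path is an assumption about a default install).
source venv/Scripts/activate        # Git Bash on Windows; use venv/bin/activate on Linux
pip uninstall -y tensorrt           # drop the broken TensorRT wheel
pip cache purge                     # so the next install re-downloads cleanly
pip uninstall -y nvidia-cudnn-cu11  # remove the conflicting cuDNN package
pip install onnxruntime colored onnx
```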

Xformers Command:

PIP Upgrade Command:

CUDA 12.1 Torch 2.1.1 Command:
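The commands themselves are not reproduced on this page. Typical equivalents look like the following sketch; the exact versions and index URL are assumptions based on the torch/xformers versions discussed in the comments below:

```shell
# Xformers (0.0.23 is the release paired with torch 2.1.1):
pip install xformers==0.0.23

# PIP upgrade:
python -m pip install --upgrade pip

# CUDA 12.1 / Torch 2.1.1 (torchvision 0.16.1 is the matching release):
pip install torch==2.1.1 torchvision==0.16.1 --index-url https://download.pytorch.org/whl/cu121
```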

------------------------

Timestamps:

0:00 Intro.
0:58 About TensorRT.
3:52 A1111 Dev Branch Install.
7:39 Fixing Errors.
14:25 NVIDIA Control Panel Setting.
15:27 RT Engine Configuration.
21:03 Exporting Process.
23:00 Testing with ADetailer.
Comments

It's installing Optimum on mine, and it's been stuck for a while now. What should I do?

Senpaix

Hi! How do you ignore the (DEPRECATION) warning you show at 10:11?
What keys do you press?

saraartemi

Your explanations are clear. In my case, installing tensorrt==9.0.1.post11.dev4 and xformers triggers an automatic reinstall of torch 2.1.2, and I lose compatibility with torchvision. It isn't blocking: I installed tensorrt==9.0.1.post12.dev4 and xformers 0.0.23 instead, and also reinstalled torchvision==0.16.1+cu121. One question: where did you find the file we see in the control area as SD_unet=[TRT] sd_xl_base_1.0? THANKS

xyy

My TensorRT tab is now missing, don't know what's up with that...

uiane

The error comes at 13:16, fixing xformers... this is the message:
Installing collected packages: torch, xformers
Attempting uninstall: torch
Found existing installation: torch 2.0.1+cu118
Uninstalling torch-2.0.1+cu118:
Successfully uninstalled torch-2.0.1+cu118
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
torchvision 0.15.2+cu118 requires torch==2.0.1, but you have torch 2.1.1+cu121 which is incompatible.
Successfully installed torch-2.1.1+cu121 xformers-0.0.23


so we're stuck..

vruser
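The conflict in the log above is torch moving to 2.1.1+cu121 while torchvision stays at 0.15.2+cu118 (built against torch 2.0.1). That isn't a dead end; a sketch of a fix, assuming the cu121 wheel index (0.16.1 is the torchvision release paired with torch 2.1.1):

```shell
# bring torchvision in line with the newly installed torch 2.1.1+cu121
pip install torchvision==0.16.1 --index-url https://download.pytorch.org/whl/cu121
```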

How do I go about solving this after installing TensorRT the second time around?
To create a public link, set `share=True` in `launch()`.
Creating model from config:
Startup time: 7.5s (prepare environment: 1.6s, import torch: 2.7s, import gradio: 0.7s, setup paths: 0.7s, initialize shared: 0.2s, other imports: 0.4s, load scripts: 0.6s, create ui: 0.4s, gradio launch: 0.3s).
Applying attention optimization: Doggettx... done.
Model loaded in 2.5s (load weights from disk: 0.5s, create model: 0.2s, apply weights to model: 1.4s, load textual inversion embeddings: 0.2s, calculate empty prompt: 0.1s).
*** Error running install.py for extension
*** Command:
*** Error code: 1
*** stderr: Traceback (most recent call last):
*** File "C:\AI\sd.webui\webui\extensions\Stable-Diffusion-WebUI-TensorRT\install.py", line 3, in <module>
*** from importlib_metadata import version
*** ModuleNotFoundError: No module named 'importlib_metadata'

xBennyx
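The ModuleNotFoundError in the traceback above usually just means the package the extension's install.py imports is missing from the WebUI's venv. A minimal sketch of a fix (run inside the venv):

```shell
# install the missing module that install.py imports
pip install importlib_metadata
# then restart webui.bat so the extension's install.py runs again
```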

Will this work with new cards only, or can it work with a Tesla P40?

gamalfarag

Things went smoothly, but then I tried to install xformers (I already had it, but I wanted to make sure I had the latest one, so I didn't uninstall xformers, I just ran the install command). After that, launching webui.bat shows the errors again x)

daemoniax

After doing the last step it went back to the original error, boo...

FearfulEntertainment

I can't create one ~ I get this:
Failed to parse ONNX model. Does the model file exist and contain a valid ONNX model?

Heldn
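"Failed to parse ONNX model" often means a truncated or stale export left over from an earlier attempt. A sketch of a reset, assuming the extension's default export folders (the path and folder names are assumptions; check your own install before deleting anything):

```shell
cd /path/to/stable-diffusion-webui          # hypothetical path; use your own
rm -rf models/Unet-onnx models/Unet-trt     # discard the broken ONNX/TRT exports
# then re-export the engine from the TensorRT tab in the WebUI
```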

Too bad it does not work for video generation just yet

andrejlopuchov

The TensorRT extension has been available for 2 months

blender_wiki

WOW, you gain 15s! lol, not a big change tbh - that's why I haven't dabbled with it yet

cyril

I was following the tutorial and everything was great until I installed xformers as shown in the video. At the end of the install I noticed it installed xformers, but it also upgraded PyTorch to 2.2.1, uninstalled the previous 2.1.2, and said 2.2.1 was not compatible. I then tried the command you put at the bottom, since it mentioned torch 2.1.2, and it seemed to reinstall the correct torch. But when I resumed the video, after modifying webui-user.bat to include the --xformers flag, the command prompt window showed this error:
Traceback (most recent call last):
File "C:\Stable2\webui\launch.py", line 48, in <module>
main()
File "C:\Stable2\webui\launch.py", line 39, in main
prepare_environment()
File "C:\Stable2\webui\modules\launch_utils.py", line 386, in prepare_environment
raise RuntimeError(
RuntimeError: Torch is not able to use GPU; add --skip-torch-cuda-test to COMMANDLINE_ARGS variable to disable this check
I'm not sure what to do now; I'll maybe retry following the video...
I'm using Forge A1111; might that be the cause of these errors? Maybe it doesn't need or want this extension? I'm pretty new at this, started not a week ago.
I managed to fix it by giving it older libs for torchvision, torchaudio, and xformers that I found in another video; once it launched, it detected the old libs in the terminal and gave me commands for more up-to-date libs than the ones I had put.

phenix
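For reference, the xformers switch in webui-user.bat is `--xformers` (two dashes, plural). A sketch of the relevant line, assuming an otherwise default file:

```bat
:: webui-user.bat fragment (not the author's exact file)
set COMMANDLINE_ARGS=--xformers
```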

Here we have some different packages... mine are newer...
Requirement already satisfied: filelock in (from torch==2.1.1->xformers) (3.13.1)
Requirement already satisfied: typing-extensions in (from torch==2.1.1->xformers) (4.9.0)
Requirement already satisfied: sympy in (from torch==2.1.1->xformers) (1.12)
Requirement already satisfied: networkx in (from torch==2.1.1->xformers) (3.2.1)
Requirement already satisfied: jinja2 in (from torch==2.1.1->xformers) (3.1.2)
Requirement already satisfied: fsspec in (from torch==2.1.1->xformers) (2023.12.2)
Requirement already satisfied: MarkupSafe>=2.0 in (from (2.1.3)
Requirement already satisfied: mpmath>=0.19 in (from (1.3.0)
Installing collected packages: torch, xformers

vruser

But you already have a 4090, so it makes no difference. I think TensorRT is more valuable for the RTX 4070 and below, which struggle with speed.

relexelumna

Thanks for your simple explanation :)
In the flags section, for FP16, I searched a lot trying to activate it but I couldn't, and I don't know how to do it. Can you help me with that if possible?
Thanks again ^^

mohamadelsawi

UserWarning: Failed to load image Python extension: '[WinError 127]

Brunobarros-phfw