Updated Oobabooga Textgen WebUI for M1/M2 [Installation & Tutorial]

In this video I will show you how to install the Oobabooga text generation webui on M1/M2 Apple Silicon. We will also download and run the Vicuna-13B v1.3 model.

▬▬▬▬▬▬▬▬▬▬▬▬▬▬ CONNECT ▬▬▬▬▬▬▬▬▬▬▬
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
LINKS:

▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
All Interesting Videos:

Recommendations
Comments

Do more Apple silicon tutorials. Great work.

DoctorCulture

🎯 Key Takeaways for quick navigation:

00:00 🖥️ The tutorial explains how to install the Oobabooga Text Generation Web UI on a personal machine for open-source language model applications.
00:26 🍏 This tutorial focuses on Mac OS installation, specifically on Apple silicon machines.
01:46 👨‍💻 Cloning the GitHub repository and creating a custom name for the folder is the first step in the installation process.
02:26 🛠️ Next, a new virtual environment is created with Python 3.10.10.
05:52 📚 After installing required packages, an Apple silicon-compatible PyTorch version is added.
08:34 📈 The tutorial covers downloading various models and handling model-specific options.
10:12 ❗ It also shows how to switch to the chat interface and use different prompt templates.
13:55 🧪 At the end, the tutorial provides a real-time test of the implemented model, yielding instant conversation generation.

Made with Socialdraft

AntonioEvans
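The steps in the summary above can be sketched as a short shell session. The folder name `textgen` and the exact package commands are assumptions for illustration; the repository URL is the official oobabooga repo, and the PyTorch line relies on pip serving an Apple-silicon (MPS) build:

```shell
# Clone the repository into a custom folder name (step at 01:46)
git clone https://github.com/oobabooga/text-generation-webui.git textgen
cd textgen

# Create and activate a Python 3.10 virtual environment (step at 02:26)
python3.10 -m venv venv
source venv/bin/activate

# Install the required packages (step at 05:52)
pip install -r requirements.txt

# Install an Apple-silicon-compatible PyTorch build
pip install torch torchvision torchaudio

# Launch the web UI and open the printed local URL in a browser
python server.py
```

The virtual environment keeps the webui's pinned dependencies from clashing with anything else installed on the machine.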

Why is there always an error when installing the required packages? "note: This error originates from a subprocess, and is likely not a problem with pip.
error: metadata-generation-failed"

menglinliu

RuntimeError: MPS backend out of memory (MPS allocated: 17.64 GB, other allocations: 498.71 MB, max allowed: 18.13 GB). Tried to allocate 50.00 MB on private pool. Use to disable upper limit for memory allocations (may cause system failure). What should I do?

AIFemmeFatale
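The truncated error message above normally names an environment variable. On recent PyTorch builds the MPS memory ceiling can be lifted as sketched below; note that removing the limit can cause heavy swapping or system instability, exactly as the error warns, so a smaller or quantized model is often the better fix:

```shell
# Disable the MPS upper memory limit (0.0 = no limit; use with care)
export PYTORCH_MPS_HIGH_WATERMARK_RATIO=0.0
python server.py
```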

Hi, I really need to ask: how do I run the program again after I quit it?

pekoemo
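If the setup followed the video, relaunching after quitting is just a matter of re-entering the project folder, re-activating the environment, and starting the server again. The folder name `textgen` and the `venv` directory here are assumptions from the tutorial; with a conda setup the activation line would be `conda activate <env-name>` instead:

```shell
cd textgen                   # the folder the repo was cloned into
source venv/bin/activate     # re-activate the Python environment
python server.py             # start the web UI again
```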

When will it be possible to install the textgen webui on AMD on Windows?

wessfripes

Dude, your explanation is excellent. For those of us at the beginner or intermediate level, you explain the topics so well that we can follow along without any difficulty. I have searched a lot, but no one explains these technologies as simply and clearly as you do. I hope you continue; I think you'll be in demand after a while if you keep this up. I am grateful to you.

rehberim

I got it working on M1, thanks for the guide, it's awesome! The thing is, on M1 it seems to be incredibly slow, often timing out. Is this a limitation of the M1 chip, or is there something I can do to optimise it? I'm just in the process of installing on a PC with a Ryzen 5 7600X and an RTX 3080 to see how it compares.

BBK.

For weeks I tried to get Oobabooga installed on my M1 Pro; I always got the "autogptq module not found" error. Thank you very much for your great work.

dfas

Followed the instructions line by line: "ValueError: The current `device_map` had weights offloaded to the disk. Please provide an `offload_folder` for them. Alternatively, make sure you have `safetensors` installed if the model you are using offers the weights in this format."

michaeldoyle

Running the same model on my MacBook M1 Pro with the same settings as in the video, the token generation speed is 0.03 tokens/s... Is it GPU accelerated?

TimSheung-ch

Hi. Do you have a tutorial showing how to install this on a 2015 Mac with an Intel chip? I can't get it to work. Thanks.

mastolle

I've installed this on all of my M1s :)

rgm

When trying to install requirements.txt, I get the following error: ERROR: Ignored the following versions that require a different python version: 1.6.2 Requires-Python >=3.7, <3.10; 1.6.3 Requires-Python >=3.7, <3.10; 1.7.0 Requires-Python >=3.7, <3.10; 1.7.1 Requires-Python >=3.7, <3.10
ERROR: Could not find a version that satisfies the requirement autoawq==0.1.4 (from versions: none)
ERROR: No matching distribution found for autoawq==0.1.4

Any ideas how this can be sorted? autoawq seems to be incompatible with Python 3.10?

logicfigures
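The "from versions: none" in the error above usually means pip found no wheel for the platform at all, not a Python-version mismatch: autoawq publishes CUDA-only builds, so there is nothing to install on macOS. A common workaround is to drop the package from the pinned requirements, as sketched below (later versions of the repo ship a separate Apple-silicon requirements file, so check what your checkout actually contains before editing anything):

```shell
# Remove the CUDA-only package from the pinned requirements (BSD sed syntax, as on macOS)
sed -i '' '/autoawq/d' requirements.txt
pip install -r requirements.txt

# Or, if your checkout provides one, use the Apple-silicon file instead:
# pip install -r requirements_apple_silicon.txt
```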

Is it stopping at loading checkpoint shards for anyone else?

MarcusClark-mp

It's 5 months later; is the installer working now? I'm not a Terminal user.

KainsTorment

I was right there with you until you started talking about conda. Great job explaining for a noob like myself, but I'm going to have to figure out where and what conda is before I can continue with this tutorial.

MobileGamingLegendz

Is it possible to do this using Docker on MacOS?

trezero

No luck, the same error again and again with `python -m pip install -r requirements.txt`:

ERROR: Failed building wheel for llama-cpp-python
ERROR: Could not build wheels for llama-cpp-python, which is required to install pyproject.toml-based projects

gileneusz
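llama-cpp-python compiles C/C++ code at install time, so the wheel build typically fails when Xcode's command-line tools or cmake are missing. A sketch of the usual fix on macOS; the `LLAMA_METAL` flag name has changed across llama.cpp versions, so treat it as an assumption for whichever version the requirements pin:

```shell
# Make sure a compiler toolchain and cmake are present
xcode-select --install    # no-op if the tools are already installed
brew install cmake

# Rebuild from source with Metal acceleration enabled
CMAKE_ARGS="-DLLAMA_METAL=on" pip install --no-cache-dir llama-cpp-python
```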

It says: "RPC failed; curl 92 HTTP/2 stream 5 was not closed cleanly: CANCEL (err 8)". Can anyone help me? 🥲

howchingcai
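That curl 92 error is git's HTTP/2 transfer being cut off mid-clone, common on flaky connections or large repositories. Two generic git workarounds, not specific to this project:

```shell
# Fall back to HTTP/1.1, which tolerates interrupted transfers better
git config --global http.version HTTP/1.1

# And/or fetch less data up front with a shallow clone
git clone --depth 1 https://github.com/oobabooga/text-generation-webui.git
```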