Run FLUX locally on a MacBook - the free and open-source model that beats Midjourney

Step-by-step tutorial on running FLUX-dev (fp8) on a MacBook. #macbook #flux #text2image

My MacBook spec: M3 Pro with 36GB RAM

Affiliate links: buy hardware on Amazon

Thank you for watching!
Comments

I use the Flux-1 model on my MacBook Air 15" M2 with 8GB RAM via DiffusionBee: slow, but with gorgeous output. Really impressive.

marcostuppini

Wow! I cannot thank you enough. I have been trying to install Flux since it came out with zero success. I even started paying to use it through Replicate. But then I saw this tutorial, deleted my whole ComfyUI setup, reinstalled, followed this tutorial, and it worked. Thank you so much for this tutorial, man.

INVICTUSSOLIS

Another question: have you tried the FLUX ControlNet and IPAdapter? Not sure whether the XLabs nodes can run on macOS.

YING

Thanks for your video—simple and to the point.

Could you let me know which app you're using to monitor CPU, GPU, and RAM usage in the video?

Many thanks! 🙏🏻

chanmichael

My output is always a garbled, noisy mess, like hyper-JPEG compression; nothing is distinguishable from anything else.
Everything seems to work until I output something.

ThreeBooleans

Lol! More than 10 minutes! It's amazing!

miqlas

What version of PyTorch are you using? My outputs are not being decoded properly, and I've read somewhere that's possibly to do with PyTorch.

blackmaitreya

I'm on an M1 Ultra. The outputs are all dotted noise. Using Schnell.

jasongaretthatcher

After following your process, I got an error like this: {TypeError: Trying to convert Float8_e4m3fn to the MPS backend but it does not have support for that dtype.} What's happening? Can you help me?

银色F猫怪
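For context on this dtype error (and the similar BFloat16 reports further down): Apple's MPS backend has no float8 storage type, so an fp8 checkpoint cannot be moved to the GPU as-is. A minimal sketch of the commonly reported workaround, assuming a standard ComfyUI checkout where `main.py` is the entry point (verify the flag on your install with `python main.py --help`):

```shell
# MPS cannot hold Float8_e4m3fn tensors, so ask ComfyUI to upcast
# model weights to fp16 when loading the fp8 checkpoint:
python main.py --force-fp16

# For the BFloat16 variant of the error, also check the PyTorch build:
# bfloat16 on MPS requires torch >= 2.3 on macOS 14 or newer.
```

Alternatively, downloading the fp16 weights directly avoids the conversion entirely, at the cost of more disk space and RAM.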

Hi 👋 How about the M3 Max's unified memory (64GB, 128GB): does it really work with big generative AI image/video models? Big images? Longer video sequences? Is it more interesting than Nvidia RTX solutions? More efficient?

medboussouni

Thank you. I get this error; can you help me?

MPS backend out of memory (MPS allocated: 9.06 GB, other allocations: 384.00 KB, max allowed: 9.07 GB). Tried to allocate 27.00 MB on private pool. Use to disable upper limit for memory allocations (may cause system failure).

admintadbir
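On the out-of-memory side: PyTorch's full MPS OOM message (truncated in the comment above) points at an environment variable controlling the allocation ceiling. A minimal sketch, assuming ComfyUI is launched from a shell; `PYTORCH_MPS_HIGH_WATERMARK_RATIO` is PyTorch's knob, and `0.0` removes the cap entirely (PyTorch itself warns this may cause system instability):

```shell
# Lift the MPS allocation cap in the shell that will launch ComfyUI.
# 0.0 disables the upper limit; a value like 0.9 keeps a softer ceiling.
export PYTORCH_MPS_HIGH_WATERMARK_RATIO=0.0
```

Closing other GPU-heavy apps, or switching to a smaller checkpoint, is the safer first step on 8-16GB machines.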

I keep getting this error, what does it mean?


Trying to convert Float8_e4m3fn to the MPS backend but it does not have support for that dtype.

nok

Please help. Why do I get a "Trying to convert Float8_e4m3fn to the MPS backend but it does not have support for that type." window notification? How can I fix it?

roedigoen

Thanks for your video, that's what I need. Do you know whether a Mac Pro can run Kijai's fp8 model? It's only 11GB, but when I run the model, the error reports that the Mac MPS backend cannot support the dtype.

YING

I always get a "TypeError: BFloat16 is not supported on MPS" error whichever model I use on my MacBook Pro M3 Max 48GB. I am using the simple-to-use FP8 checkpoint version of Flux dev.

JamesArslan

Can it work on a Mac mini M2 with 16GB RAM?

garen

I'm getting "KSampler - BFloat16 is not supported on MPS"; how can I fix that?

nousllc

I have a Mac Studio M2, but it's not nearly as fast as yours with Flux. What changes did you make for it to be so fast?

antiplouc

Wow, it's slow; I thought 2 min 30 sec on my cheap $680 laptop was slow.

Tatar_Piano

I have an M3 Max 48GB and the output is very noisy. What am I doing wrong? I'm using flux1-dev-fp8.safetensors.

ЦыхраЦыхра