How to Run LLM Locally on Your Mac

This is your ultimate guide to running large language models locally on your Mac. In this video we cover three products, ordered from easiest to most involved, each of which lets you run large language models on your own laptop or computer.

We cover how to set each of them up and use them, and how to interact with their APIs to build your own AI applications.
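If you want a feel for that API interaction before watching, here is a minimal Python sketch. It assumes the local tool exposes an OpenAI-compatible chat completions endpoint (LM Studio's built-in server defaults to http://localhost:1234/v1); the base URL and model name are placeholders to swap for your own setup.

# Minimal sketch: query a locally hosted model through an
# OpenAI-compatible chat completions endpoint.
import requests

BASE_URL = "http://localhost:1234/v1"  # assumed LM Studio default; change for your server
MODEL = "local-model"                  # placeholder; use the model id your server reports

response = requests.post(
    f"{BASE_URL}/chat/completions",
    json={
        "model": MODEL,
        "messages": [
            {"role": "user", "content": "Say hello in one sentence."}
        ],
        "temperature": 0.7,
    },
    timeout=120,
)
response.raise_for_status()
# Print the assistant's reply from the first choice
print(response.json()["choices"][0]["message"]["content"])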

---
Comments

Hi, very nice video! Thank you! Do I understand correctly that I won't be able to use the LM Studio option with the following Mac: MacBook Pro 13.3 i5 1.4GHz/16GB/256GB SSD/INT/macOS?

monikawodarczyk-dyrga

Hi Jeremy, love your content! I know it's not a Mac, but I have an Alienware m18 R2 with an Intel i9-14900HX, an NVIDIA RTX 4090 (24GB), 64GB RAM, and 8TB of storage, and I struggle to run LLaMA 70B models. Could you create a video for users like me on optimizing setups (8-bit quantization, mixed precision, etc.) to run large models efficiently? Your help would be greatly appreciated. Many thanks!

theuniversityofthemind