Watch this BEFORE buying a LAPTOP for Machine Learning and AI 🦾

Machine learning on a laptop, is that even possible? How about MacBooks?
What hardware do I need? What should I spend? What do I need to focus on?

Here's the follow-up on how to train machine learning models in the cloud for free:

This video discusses the original M1. However, the same logic applies to the Apple M2 Pro and M2 Max; they're just ✨even better✨.

——————————————————————————————————————
Join the community and support these videos:
——————————————————————————————————————

——————————————————————————————————————
🎥 My Camera Gear
🎼 My Music
——————————————————————————————————————

⏱️ Timestamps
00:00 Intro
00:29 Training with a Laptop
00:37 Difference Desktop and Laptop
01:23 The Apple Ecosystem
03:33 Do you even GPU, bro?
04:30 Everything you need to understand about computer hardware
08:49 What type of GPU you need
09:27 Should you even do Deep Learning on a Laptop
11:15 What to prioritize in your Laptop Hardware
12:25 Making it work with a small-ish Laptop
13:02 With a GPU you can try Nvidia RAPIDS cuML
13:47 What else is there to consider?
16:00 So can you train machine learning on your laptop?
17:01 My Recommendation
18:00 Byeeee

——————————————————————————————————————
👋 Social

📝 Disclaimer

Opinions my own. Not financial advice. Sponsors are acknowledged. For entertainment purposes only.
Comments

Here's how you train a model in the cloud for free:

JesperDramsch

You are so very correct. Especially for newer AI developers, long training times are not the norm. We use everything from RTX cards through H100s for most of our AI development, at least on the training side. However, for coding, data science work, inference, and UI/UX, we all use our favorite OS, whichever that is. One thing to keep in mind for professional-level, large-parameter/large-dataset AI development: you will often be using a dedicated server running in the kilowatts with AI-grade TPUs/GPUs (e.g. V100s, H100s, etc.). Whether owned, hosted, or otherwise, few jobs will be run locally.

amdenis

Man, this is so, so helpful :) Many thanks for patiently covering all the key concepts!

TheChanjoo

This channel needs way more subs! The content is high quality / well explained. :)

truthmatters

I got myself an M1 Air a couple of months ago. One thing I dislike is that TensorFlow has multiple issues on the Mac. It's better to learn about scaling and deploying first, because clouds are always available, rather than throwing a large amount of money at hardware. As for whether it's worth it once you're very advanced in the field, I'll update when I get there 😂

Side note: I have a 3070, but I realised model design and preprocessing play a bigger part in ML.
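
For anyone running into the same TensorFlow-on-Mac issues, a minimal sanity check is to see whether the Metal GPU is visible at all. This is only a sketch and assumes the tensorflow-macos and tensorflow-metal packages are installed:

import tensorflow as tf

# List what TensorFlow can see; on Apple silicon with tensorflow-metal the
# M1 GPU should show up as a GPU device.
print("TensorFlow version:", tf.__version__)
print("Visible devices:", tf.config.list_physical_devices())

if tf.config.list_physical_devices("GPU"):
    # A tiny matmul confirms the Metal backend actually executes work.
    with tf.device("/GPU:0"):
        a = tf.random.normal((1024, 1024))
        b = tf.random.normal((1024, 1024))
        print("Matmul on GPU OK, shape:", tf.matmul(a, b).shape)
else:
    print("No GPU visible; TensorFlow will fall back to the CPU.")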

waynelau

I understand your point, but I don't fully agree with the sentence where you say (in my words) "a CPU with just a load of RAM will be enough". I'll explain why:

Though you are right that we have to prioritize RAM, the CPU is important too. Try training a model with the Weka workbench (Java-based) on your laptop or desktop computer; a fast CPU will help.

Students will do deep learning and not necessarily limit themselves to classical machine learning with scikit-learn or whatever framework. So...

a) Have a lot of RAM, yes, but a very good CPU too. Most probably, when you work with ML models, you are working on an application with many components, not all of which are ML-based. You could design a Node.js-driven UI that talks to a back end that you also develop on your computer and that serves the model.
To do this in an efficient and organised way, you will end up with containers, and that is why you'll need CPU and RAM (even though containers are lightweight).

b) Because of (a), you will probably start diving into both DevOps techniques and the MLOps paradigm. Both of them require automation, which also consumes CPU, especially if you have a C++ or Java application that must be built.

c) Because of (a) and (b), your computer will start to carry quite some load just running all these things.

d) Though an NVIDIA RTX is quite expensive, it can help you a lot with deep learning tasks and lets TensorFlow use the onboard GPU. There you will face interesting issues: you'll probably hit VRAM limits during training and will have to work hard, but you'll learn a lot getting a really good neural network architecture running on your machine (see the sketch after this comment).

e) You talk about using the cloud; I partly agree, but that is only for experienced people. Others will have a hard time making it work (and I am not talking about Google Colab or other toy setups).
Therefore you will travel from (a) to (d) on your local machine.

Personally, I follow you and agree with what you say about using open systems and not Macs. Two years ago I bought a Linux laptop with an onboard NVIDIA RTX. Because of the budget I could only afford an RTX 3060, but I got a very good Intel i7 with 16 vCPUs and 32 GB of RAM. All that for less than €1800, with a wide 17" screen. But that was two years ago. Today I would go for a more robust RTX card and put in 64 GB or 128 GB of RAM directly.

The other thing I would recommend is the battery: choose a good one, and choose a laptop with spare batteries. Also, because you will work with Docker containers and perhaps keep many versions of Python virtual environments, think about disk space: today I recommend a MINIMUM of 2 TB of SSD. If you can afford more, even better.

Then yes, using a cloud solution is also elegant, but you'll still need an efficient laptop because of (a) to (e).
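
On point (d), a small, hedged example of how the VRAM pressure is usually handled first: ask TensorFlow to grow GPU memory on demand instead of grabbing the whole card up front (assumes a CUDA build of TensorFlow; your architecture and batch-size tuning still do the real work):

import tensorflow as tf

gpus = tf.config.list_physical_devices("GPU")
for gpu in gpus:
    # Allocate VRAM as needed; this must be set before the GPU is first used.
    tf.config.experimental.set_memory_growth(gpu, True)

print(f"Memory growth enabled on {len(gpus)} GPU(s)")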

alexandrevalente

I already owned an Acer Nitro 5 with a mobile RTX 3070 + R7 5800H, but I still watched your full video :). My laptop can train 90% of model types after I cranked the virtual memory up to 80 GB (from 16 GB of RAM 😂). I'm very satisfied with my $1500 laptop.
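
If you try the same big-virtual-memory trick, it's worth checking what the machine actually has before a long run. A rough sketch using the third-party psutil package (the 80 GB figure above is this commenter's own setting, not a requirement):

import psutil

gib = 1024 ** 3
ram = psutil.virtual_memory()
swap = psutil.swap_memory()

# Physical RAM plus swap/pagefile is the ceiling for RAM-hungry preprocessing.
print(f"RAM:  {ram.total / gib:.1f} GiB total, {ram.available / gib:.1f} GiB available")
print(f"Swap: {swap.total / gib:.1f} GiB total, {swap.used / gib:.1f} GiB used")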

tutan

Thank you
You just saved me from going broke 😅

I was thinking of getting a PC but was confused about which graphics card to get.

slliks

This was the first video of yours I ever watched, and when I started I thought, naaah, a new MacBook Pro could surely be fine for training models. I can't tell you how wrong I was. The hype is very different from reality, and you are 100% correct. I have had to embed so many special cases into my training pipeline to support MPS (Metal), and even then torchvision support is still incomplete in v2. I ended up going for an RTX 4090 in a separate headless Linux server, and it reduced training time on my use cases by an order of magnitude.
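
For anyone facing the same MPS special-casing, the device selection usually ends up looking roughly like this; a hedged sketch, not the commenter's actual pipeline code:

import torch

def pick_device() -> torch.device:
    # Prefer CUDA, then Apple's MPS backend, then plain CPU.
    if torch.cuda.is_available():
        return torch.device("cuda")
    if torch.backends.mps.is_available():
        # Some ops are still missing on MPS; launching with the environment
        # variable PYTORCH_ENABLE_MPS_FALLBACK=1 lets those run on the CPU.
        return torch.device("mps")
    return torch.device("cpu")

device = pick_device()
x = torch.randn(8, 3, 224, 224, device=device)
print("Using", x.device)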

joecincotta

Great advice, because I've been looking for a second machine for my deep learning research. Now I will switch my strategy from a local machine to the cloud. Thanks.

kingfukj

This is a really good video! I stumbled here and got close to buying an NPU mini PC. However, I think I am okay running Gemma 2 on Ollama and learning. Thanks. Subscribed!
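
For reference, a minimal sketch of talking to a locally running Gemma 2 through Ollama's HTTP API (assumes Ollama is running on its default port and the gemma2 model has already been pulled):

import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "gemma2",
        "prompt": "Explain the difference between RAM and VRAM in two sentences.",
        "stream": False,  # return a single JSON object instead of a token stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])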

prlgix

The M1s run very cool, and you don't need to worry about running long tasks even on the fanless one. All 1st-gen M1s have a maximum of 16 GB of (V)RAM (including the iMac); the M1 Ultra has a maximum of 128 GB of (V)RAM.
Unified memory also removes the PCIe bottleneck between VRAM and RAM, with a combined bandwidth of up to 0.8 TB/s.

yagoa

For PhD students, maybe it's better to access a university PC from your MacBook at home :D

thiagocavalcante

Just discovered your channel and I'm already a fan.
I have recently started learning data science, currently focusing on Python and statistics.
Looking forward to your guidance through this channel.

AamirSiddiquiCR

You're right, my laptop is really aerodynamic... I find myself playing frisbee with it all the time.

jonconnor

We're due for an updated video after the M1X Mac.

NiiAnikin

Hi!
I am also starting to learn AI and ML now.
Can you please help me with a few things?
1) After how long will I need a better laptop, or can I keep using my current one? Right now I have an office laptop with an Intel i5 10th-gen U-series processor with integrated graphics.
2) Since I am just starting to learn, where should I start for AI & ML?
3) Is the Asus ROG Flow X13 2023 a good option? It has a Ryzen 9 7940HS and an Nvidia RTX 4050 6GB (60W). I want this one because it is super portable and would also help with taking notes since it has a touchscreen.

Also, is 16 GB of RAM enough in the laptop?

It would be great if you could help me out a bit.
Thanks!

aquaRuHoshino

Thank you for this. I'm interested in ML for advancement in my career and your explanation is helpful.

animegod

Loved this video, Jesper. Thank you!!! I’m happy I came across your channels

rebecasarai

This was really cool. I bought an old server GPU and put it into a big desktop PC case with a hydroponics fan to cool it down. Got a 24 GB Tesla card for USD $300 that is about as fast as a 1070 Ti.
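
If you go the used-server-GPU route, a quick way to confirm the card and its 24 GB are actually visible from Python (a sketch; assumes a CUDA build of PyTorch and a driver that still supports the card):

import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    print(f"{props.name}: {props.total_memory / 1024**3:.0f} GiB VRAM")
else:
    print("CUDA not available; check the driver and PyTorch build.")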

joecincotta