NVIDIA GeForce RTX 3090 vs 3080 vs 3070 vs 3060Ti for Machine Learning

In this video we look at a high-end machine learning workstation provided by Exxact Corporation. I will feature this machine in several videos; any suggestions on what to try? Here we review the system specs and perform a simple test of the two 3090 GPUs.

0:36 Exxact System
1:07 NVIDIA GeForce RTX 30xx Overview
2:48 StyleGAN2 ADA PyTorch Benchmark
8:28 3090 NVLink

Exact system specs:

** System Used **

* TRX40 Motherboard
* Threadripper 3960x
* 128GB Memory (16GBx8)
* 2x 4TB PCIe 4.0 NVMe
* 2x NVIDIA GeForce RTX 3090 | 3080 | 3070
* NVLINK Bridge w/3090

For more information about the machine featured in this video, please visit:

0:56 Exxact Corporation
1:29 System Specs
3:01 Starting the System
3:51 SSH Tunnel
4:07 Monitoring GPU Performance
4:51 NVIDIA GeForce RTX 3090
7:00 Detect GPU
7:30 Test Multiple GPUs
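
The 7:00 and 7:30 chapters cover detecting the GPUs and exercising both cards. As a minimal sketch (assuming PyTorch, the framework used for the StyleGAN2-ADA benchmark), something along these lines lists the visible cards and checks whether the two 3090s can reach each other directly over NVLink/peer-to-peer; names and counts will depend on the machine:

import torch

# Enumerate every CUDA device PyTorch can see on this machine.
print("CUDA available:", torch.cuda.is_available())
for i in range(torch.cuda.device_count()):
    props = torch.cuda.get_device_properties(i)
    print(f"GPU {i}: {props.name}, {props.total_memory / 1024**3:.1f} GB")

# With two cards installed, check for direct peer-to-peer access
# (NVLink or PCIe), as discussed in the NVLink chapter.
if torch.cuda.device_count() >= 2:
    print("Peer access 0<->1:", torch.cuda.can_device_access_peer(0, 1))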

Jupyter remote through SSH tunneling:
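
As a rough sketch of the idea (assuming Jupyter is listening on its default port 8888 on the workstation, and with user@workstation as a placeholder login): run ssh -N -L 8888:localhost:8888 user@workstation on the local machine, then open http://localhost:8888 in a local browser to reach the remote notebook server.
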
Comments

Thank you for your videos! I bought a 3090 for my startup - we are working on computer vision (object detection, depth estimation). It works almost 3x faster on a MobileDet detector compared to my old 3070 because of the larger batch size. With the same batch size, the 3090 is about 2x the 3070 in speed.

valdisgerasymiak

Hey Dr. Heaton, thank you for this much-requested video. Keep up the awesome work, you are an inspiration to many in the deep learning community :)

adityay

I am currently training StyleGAN2-ADA with an RTX 3060; your previous videos helped me a lot, and 12 GB of VRAM is good to have. Can you include the RTX 3060 in the next video?

travelthetropics

Great video. I'm building a 3090 machine for ML, so I'm hoping more of your content (like this one) can address hobbyist/autodidact ML folks like me.

csqr

After upgrading from a 3090 Ti to a 4090, there was no difference in training time between them when I trained InceptionResNet-V2 on an 8-class classification task. But there is a significant improvement in training time between the 3090 Ti and the 3090.

oscarjeong

Thanks, this is an excellent review. I have some notes on the 3060 for those who are considering it. I ended up getting a new 3060 12GB in 2023. I used Kaggle's free tier a lot (P100, 16GB). The 3060's training times are similar to the Kaggle free tier for the same batch sizes. 12GB works really well for me. It's a good deal for a low-cost GPU.

buildwithcuriosity

Thank you for providing this valuable content to the community!
I'm a data scientist with a strong background in Bayesian statistics, looking to get into deep learning.
I'm building a workstation with a 5950X + 128GB RAM to learn by doing various projects.
Should I invest in a 3090 to future-proof myself, or do you think the 3080 Ti (12GB) will be a safe choice for the next ~2 years? I want to learn as much as possible during that period to shift my career a bit.

muontau

Great vid, Prof! Thanks so much for being you! Love this stuff! Very minimal, easy to get up and running!

__--JY-Moe--__

Hello Jeff: thank you very much for your interesting content, which I have followed since the pre-2012 days; back then, neural networks in Java were a huge part of our common interests. Kind regards from Portugal.

luisluiscunha

Would love to see this updated for the RTX 4000 series.

ToddWBucy-lfyz

Hey Jeff

Can you suggest what kind of specs (storage, chipset, memory, cooling) one should go for when building a first single-GPU workstation? I am fixated on getting an RTX 3090 (given its 24GB of VRAM). What kind of nominal specs should I go for on a limited budget? Maybe you could make a video on it :)

aayushgarg

Thank you! I was looking at two 3060 Ti 8GB cards, which cost the same as a single 3080. So grateful for this video. Buying the 3080!

inavoid

Great vid, thanks!! Would you recommend the 3090 for inference? Looking for a GPU for large NLP models like GPT-J.

mchel
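
As rough context for the GPT-J question above: GPT-J-6B has about 6 billion parameters, so in fp16 (2 bytes per parameter) the weights alone come to roughly 6e9 x 2 bytes ≈ 12 GB before activations or KV cache, which is why a 24GB card like the 3090 is comfortable for inference while 8-12GB cards generally need offloading or quantization.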

Hi Dr. Heaton, I really enjoy your informative videos on YouTube, and I was wondering if you could help me. I am a medical doctor who has just begun with ML (currently doing a doctoral thesis in this field) and am planning to build my own workstation. Should I go for the 12 GB 3060 or the 12 GB 3080 and later in my career add a 24 GB 3090 (or 4090!) to the build, or just buy a 3090 now?

mahdimoosavi

I have an HP Omen laptop with an RTX 3080 16 GB, 32 GB RAM, and an 11th-gen i7.
I bought this laptop specifically for my experiments with transformers.

uatiger

Thanks for the video. I bought a 3090 for gaming (because I couldn't get any other cards) and am looking into taking ML courses.

alanorther

Curious how the 40 series stacks up and how they all compare to the 2080 Ti (what I have now).

Ultrajamz

Hi Jeff! I just have one question: if I have multiple RTX 3070s, for example, can their VRAM be summed? So 4 cards of 8GB = 32GB of VRAM? Thanks for your amazing content!

samuelparada
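
On the VRAM question above: in the usual data-parallel setup each GPU keeps its own full copy of the model, so four 8GB cards behave like four separate 8GB pools rather than one 32GB pool; what scales is the combined batch size per step (splitting a single model across cards is a separate, more involved topic). A minimal PyTorch sketch under that assumption, using a hypothetical toy model:

import torch
import torch.nn as nn

# Hypothetical toy model; stands in for whatever network is actually trained.
model = nn.Sequential(nn.Linear(512, 1024), nn.ReLU(), nn.Linear(1024, 10))

# nn.DataParallel replicates the model onto every visible GPU and splits each
# batch across them, so per-GPU memory use stays close to the single-card case.
if torch.cuda.device_count() > 1:
    model = nn.DataParallel(model)
model = model.cuda()

x = torch.randn(256, 512).cuda()  # this 256-sample batch is divided across the GPUs
print(model(x).shape)             # torch.Size([256, 10])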

Although GDDR6X is faster 1:1, when working with large projects such as dynamic analysis or large-scale raytracing, it can be better to have "more" GDDR6 in a card such as the RTX A6000.

deepfriedavocado

It would be really interesting to see how video memory limitations play out on M1/M1 Pro/M1 Max machines, since they can technically allocate all 16-64GB of unified memory as VRAM. Interesting to see how far one could push batch sizes :)

papsaebus
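
On pushing batch sizes (raised in a couple of the comments above): a common way to see how far a card's memory goes is simply to keep doubling the batch size until an out-of-memory error occurs. A rough sketch of that probe in PyTorch, with a hypothetical stand-in network in place of a real model:

import torch
import torch.nn as nn

# Hypothetical stand-in network; swap in the real model to get meaningful numbers.
model = nn.Sequential(nn.Linear(4096, 4096), nn.ReLU(), nn.Linear(4096, 1000)).cuda()
loss_fn = nn.CrossEntropyLoss()

batch = 32
while True:
    try:
        x = torch.randn(batch, 4096, device="cuda")
        target = torch.randint(0, 1000, (batch,), device="cuda")
        loss_fn(model(x), target).backward()  # full forward + backward pass
        model.zero_grad(set_to_none=True)
        print(f"batch size {batch} fits")
        batch *= 2
    except RuntimeError as e:
        if "out of memory" not in str(e):
            raise
        print(f"out of memory at batch size {batch}")
        break
    finally:
        torch.cuda.empty_cache()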