Mistral 7B LLM AI Leaderboard: GPU Contender Nvidia RTX 4060Ti 16GB

This week in the RoboTF lab:
The standard card on the channel, an Nvidia RTX 4060Ti 16GB, gets put through the Mistral 7B leaderboard gauntlet.

Final results at the 12-minute mark.

Just a fun evening in the lab, grab your favorite relaxation method and join in.

GPU Bench Node (These are affiliate-based links that help the channel if you purchase from them!)

Recorded and best viewed in 4K
Your results may vary due to hardware, software, model used, context size, weather, wallet, and more!
Comments

I'm interested in more tests on how two or four 4060Tis compare against a single more expensive consumer card like a 3090 or 4090: the benefits of larger total VRAM (2x16GB or 4x16GB) versus 1x24GB. Also, the graphs don't show CPU usage; is there any bottleneck from the CPU? Are there benefits from more cores, or is single-threaded performance crucial?

moozoo
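
To make the multi-GPU comparison above concrete, here is a back-of-envelope sketch (not from the video) that totals usable VRAM for each configuration. The per-card overhead figure is an assumption for illustration, not a measured value:

```python
# Aggregate VRAM of a few GPU configurations (sizes in GB).
# PER_GPU_OVERHEAD is a rough, assumed reservation per card for the
# CUDA context and framework buffers -- adjust for your own setup.

PER_GPU_OVERHEAD = 1.0  # GB per card (assumption)

configs = {
    "1x RTX 4090 (24GB)": [24],
    "2x RTX 4060 Ti (16GB)": [16, 16],
    "4x RTX 4060 Ti (16GB)": [16, 16, 16, 16],
}

def usable_vram(cards):
    """Total VRAM across all cards, minus the fixed per-card overhead."""
    return sum(c - PER_GPU_OVERHEAD for c in cards)

for name, cards in configs.items():
    print(f"{name}: {usable_vram(cards):.0f} GB usable")
```

Even with per-card overhead, four 16GB cards leave far more room for model weights and KV cache than one 24GB card, which is the trade-off the comment is asking about.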

I think the 4060 Ti 16GB is an excellent GPU for LLMs in terms of price, performance, efficiency, and VRAM, and all your 4060Ti videos confirm it.
For the current price of one 4090 you can get three to five 4060Tis, i.e., a lot more VRAM, which means much more capability: you can load bigger models, even if evaluation is a bit slower.
Obviously you will need a workstation CPU like a Threadripper or Xeon to get that many PCIe lanes.

The only downside of the 4060 Ti is its bandwidth: it is a PCIe Gen 4 x8 card, so in a PCIe Gen 3 system the link runs at Gen 3 x8, roughly half the bandwidth (equivalent to Gen 4 x4).

abhiabzs
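
The PCIe point above is easy to put in numbers. A short sketch using the commonly cited per-direction throughput figures per lane after encoding overhead (roughly 0.985 GB/s for Gen3 and 1.969 GB/s for Gen4):

```python
# Approximate one-direction PCIe throughput per lane, in GB/s, after
# encoding overhead (128b/130b for Gen3 and Gen4).
GBPS_PER_LANE = {3: 0.985, 4: 1.969}

def link_bandwidth(gen, lanes):
    """Approximate one-direction bandwidth of a PCIe link in GB/s."""
    return GBPS_PER_LANE[gen] * lanes

# The 4060 Ti physically exposes 8 lanes.
gen4_x8 = link_bandwidth(4, 8)  # card in a Gen4 slot
gen3_x8 = link_bandwidth(3, 8)  # same card in a Gen3 slot
print(f"Gen4 x8: {gen4_x8:.1f} GB/s, Gen3 x8: {gen3_x8:.1f} GB/s")
```

Gen3 x8 delivers about half of Gen4 x8, which matters mainly for model loading and multi-GPU transfers; once a model is resident in VRAM, inference is much less sensitive to link speed.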

Mates, I have a question. Currently I have a 3060 Ti 8GB and I want to pair it with a 4060 Ti 16GB for deep learning. Do you think it is a good choice to go with a 3060 Ti 8GB and a 4060 Ti 16GB for deep learning?

RaadLearn

Men in Black: Noisy Cricket. I've got this card and I think it was a good first buy. Should I get a 4090 or wait for the 5090 for inference work?

shawnvines