Blender 2.7 Tutorial #70: Speeding Up Cycles & GPU Rendering with CUDA #b3d

Thanks for watching! Please don't forget to subscribe to this channel for more Blender & technology tutorials like this one! :)

In this Blender 2.7 Tutorial #70 I cover:

- How to speed up rendering on your computer, whether you use your CPU (processor) or your NVIDIA video card as your render device.
- Settings including the number of samples, light bounces, tile size, and tile order.
- How to best utilize your NVIDIA GeForce GTX video card and its proprietary CUDA technology in Blender to greatly speed up your renders.
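
For anyone who wants to apply these settings from a script rather than the UI, here is a minimal sketch using Blender 2.7x's Python API (the property names below match the 2.7x API; later releases moved some of them):

```python
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'      # render with Cycles

# Sampling and light bounces: fewer of each renders faster, at some quality cost
scene.cycles.samples = 200
scene.cycles.max_bounces = 8

# Tile size and order: large tiles generally suit GPUs, small tiles suit CPUs
scene.render.tile_x = 256
scene.render.tile_y = 256
scene.cycles.tile_order = 'CENTER'  # start rendering from the middle of the frame

# Use the GPU (a CUDA device must already be selected in User Preferences)
scene.cycles.device = 'GPU'
```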

****************

Thanks for watching, and don't forget to Like & Subscribe to help the channel! =)

**********************************

Visit my Blender 2.7 Tutorial Series playlist for more Blender Tutorials:

Also check out my Blender Game Engine Basics Series playlist:

My Blender Video Effects Playlist:

Comments

Your series is still my favourite intro to Blender, really.
Thanks for investing your time, your talent and knowledge. I appreciate this a lot!

CastalianVisions

Don't listen to all the naysayers; the audio tone, volume, etc. were spot on as usual, just like all the others. I also really like the way you instruct; if you're not a professional teacher, it would suit you. I especially like how you get into the nuances of the areas you are speaking of. It is much easier for me to learn when it comes in small, thorough bites like you teach, rather than an overall generalization of everything, which is helpful for a walkthrough but not for deeper learning.

In other words, I like the hand-holding while I learn :P Thanks!

Synapticsnap

Don't forget the Auto Tile Size add-on (included with Blender) when you're using the GPU. It selects the best tile size for you automatically, taking into account the output resolution of your image.
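
If you'd rather enable it from a script, a one-line sketch (the module name render_auto_tile_size is assumed from the 2.7x bundled add-ons; check it against your build):

```python
import bpy

# Enable the bundled Auto Tile Size add-on (2.7x operator; module name assumed)
bpy.ops.wm.addon_enable(module="render_auto_tile_size")
```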

rustycwright

WOW! Thanks so much! I had my tile size at 64 and my brand-new 1080 Ti was not going nearly as fast as I expected. I increased the size to 256 and my render time for a scene went from 100 seconds to 30!!!

jousboxx

Downloaded the Blender file and did a CPU render test on my stock Ryzen 1700. It took 1 minute and 59 seconds. Well worth the 260 bucks I paid for this CPU a while back.

davidg

GPUs cannot do multiple tasks at the same time? That's the whole point of using a GPU: they can do way more things in parallel than CPUs. The only catch is that a GPU can run only one piece of code at a time (in rendering software, that's the render kernel). If you want to run one piece of code on lots of different data, the GPU can help. That's why you use a bigger tile size: you want to render as many pixels at once as possible (of course GPUs have limited memory, so there is a limit to how much data they can process at once).
On the other hand, CPUs can do only a few things at a time, but they do each one faster (counting single-thread performance; GPUs render faster in Blender because of parallelism). The reason you set a small tile size on the CPU is that you want a chunk of data small enough to fit into cache (there are several levels of cache between the processor and RAM, and they are much, much faster than RAM).
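
A small sketch of that rule of thumb, picking the tile size from whichever Cycles device is active (the 256 and 32 values here are illustrative, not measured optima):

```python
import bpy

scene = bpy.context.scene

# Large tiles keep the GPU's many parallel units busy;
# small tiles keep each CPU thread working on a cache-sized chunk.
tile = 256 if scene.cycles.device == 'GPU' else 32
scene.render.tile_x = tile
scene.render.tile_y = tile
```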

peto

Thank you for sharing. I appreciate you sharing your knowledge with us. I look forward to using your tutorials to further my understanding of this software, and to get some much-needed practice in.

charlesnorman

Your scene can make a huge difference too, though. If I render the simple starting cube, on the CPU it renders in 0.24 seconds but on the GPU it takes 20.89 seconds. When I import a scene with buildings and a character, the CPU takes 5.03 minutes to render but the GPU takes 2.17 minutes. It seems like the GPU renders scenes with lots of things in them much faster than it renders a scene with almost nothing in it.

SpeckyNation

Humor me and use your GPU on this same scene with your tile size set to 270x360. I bet it will be even quicker. I have figured out something that works for CPUs and GPUs on every machine I have ever tried: all the rendering tiles need to be the same size. You might think they are, but the mere fact that 64x64, 16x16, or even 512x512 does not fit the frame size perfectly means that there will be partial tiles at the edges.

If the width and height of the tiles divide evenly into the total frame size and are as close to square as possible, it will shave a few more seconds off.
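
One way to script that rule: among tile sizes that divide the frame exactly, pick the pair closest to a square target. A sketch in plain Python (the 256-pixel target is an arbitrary starting point):

```python
def even_tile_size(width, height, target=256):
    """Return (tile_x, tile_y) that divide the frame exactly,
    as close to target-sized and square as possible."""
    def divisors(n):
        return [d for d in range(1, n + 1) if n % d == 0]

    best = None
    for tx in divisors(width):
        for ty in divisors(height):
            # Distance from the target, plus a penalty for non-square tiles
            score = abs(tx - target) + abs(ty - target) + abs(tx - ty)
            if best is None or score < best[0]:
                best = (score, tx, ty)
    return best[1], best[2]

# Example: a 1920x1080 frame gives 240x270 tiles, with no partial tiles at the edges
print(even_tile_size(1920, 1080))  # (240, 270)
```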

BlenderRookie

I could NOT figure out why my GPU (GTX 980 Ti) was way slower than my CPU, but now I know. Thanks!!

ggessex

There's an add-on (supplied with the current release, at least) that will set the tile size automatically to the most efficient dimensions based on your CPU, GPU, and resolution settings. Pretty neat for the clinically lazy like me.

marcdraco

I used my GPU with 50 samples and 512 tiles at 720p, 24 fps.
Without these settings, one frame took me around 2 minutes;
now it takes just 10 seconds :)

nathaos

Pretty common stuff, but what's interesting is that I render the scene in 36.5 s on a GTX 780 (with these final settings, to be sure), so it looks like the 970 is MUCH slower even though it's MUCH newer... Yep, I like my card even more now ;)

janmatys

Hey! I tried to use a Cycles render, but after animating it, it shows a blank video...

CosmosBFN

First render: 21 min 50 sec. iMac early 2008, 2.4 GHz Intel Core 2 Duo, 4 GB 667 MHz SDRAM, ATI Radeon HD 2400 XT.
I think it's important to note the Blender version down to the day, because it speeds up quite a bit month to month.
I used 2.78.3, 11-15-2016.
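
For anyone logging render times the same way, the exact build is available from Blender's Python console (these bpy.app fields exist throughout the 2.7 series):

```python
import bpy

# Report the exact version and build date of the running Blender
print(bpy.app.version_string)  # e.g. "2.78 (sub 0)"
print(bpy.app.build_date)
```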

onjofilms

So... is this tutorial for more realistic and faster renders?

Mucdaba

For some reason, when I render with my GPU, my volumetric lighting disappears.

FragaracGaming

Hey there, great vid. I really appreciate the time you take to make these tutorials. I came across this as I'm looking to build a desktop PC for Blender, and I'm struggling to find info on the best combinations for multi-GPU setups. I've been looking at blenchmark.com and had my eye on the GTX 1070, getting three or four of them, but I can't see any benchmark tests for x3 or x4 configurations, which leaves me confused. Is this combination possible? What are some other good multi-GPU combinations for Blender? Is it difficult to set up multiple GPUs, or is it as simple as installing the drivers and following the directions you gave for GPU rendering? If you are able to reply, thank you in advance!
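
On the setup side, once the NVIDIA driver sees all the cards, each CUDA device can be ticked in User Preferences > System, or enabled from a script. A sketch using the 2.78-era preferences API (this path changed in Blender 2.8):

```python
import bpy

# Cycles compute preferences (Blender 2.78-era API path)
prefs = bpy.context.user_preferences.addons['cycles'].preferences
prefs.compute_device_type = 'CUDA'
prefs.get_devices()  # refresh the detected device list

# Enable every CUDA device the driver exposes
for dev in prefs.devices:
    if dev.type == 'CUDA':
        dev.use = True
        print("Enabled:", dev.name)
```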

clientliaisonmusic

Have you tried using empties? Just wondering how well they work on systems more powerful than my poor poor laptop.

redrune

I have a GeForce GTX 1050 Ti, but it doesn't show my card when I switch to CUDA or OpenCL. Can anyone help?
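
A quick diagnostic for cases like this is to list what Cycles actually detects from the Python console (same 2.78-era preferences path as above; an up-to-date NVIDIA driver is assumed, and Pascal cards like the 1050 Ti also need Blender 2.78 or newer for CUDA support):

```python
import bpy

prefs = bpy.context.user_preferences.addons['cycles'].preferences
prefs.get_devices()
for dev in prefs.devices:
    # A 1050 Ti should appear with type 'CUDA' if the driver supports it
    print(dev.type, dev.name)
```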

gamechallenge