GPU Computing

We often hear about people using their graphics hardware to speed up computation. How is this possible? What about a GPU makes it faster than a CPU, and why do we not use it all the time?
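
To make the speed-up concrete, here is a minimal CUDA sketch (mine, not from the video) of the classic example: adding two large arrays. The kernel name and sizes are arbitrary; the point is that the GPU assigns one lightweight thread to each element, while a CPU would typically walk the array in a serial loop.

#include <cstdio>
#include <cuda_runtime.h>

// Each GPU thread handles exactly one element, so all additions can run
// in parallel; a CPU loop would process the elements one at a time.
__global__ void vecAdd(const float *a, const float *b, float *c, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) c[i] = a[i] + b[i];
}

int main() {
    const int n = 1 << 20;                 // ~1M elements
    size_t bytes = n * sizeof(float);
    float *a, *b, *c;
    // Unified memory keeps the sketch short; real code often uses
    // explicit cudaMalloc + cudaMemcpy instead.
    cudaMallocManaged(&a, bytes);
    cudaMallocManaged(&b, bytes);
    cudaMallocManaged(&c, bytes);
    for (int i = 0; i < n; i++) { a[i] = 1.0f; b[i] = 2.0f; }
    vecAdd<<<(n + 255) / 256, 256>>>(a, b, c, n);
    cudaDeviceSynchronize();
    printf("c[0] = %f\n", c[0]);           // expect 3.0
    cudaFree(a); cudaFree(b); cudaFree(c);
    return 0;
}

Compiled with nvcc, this same element-wise pattern scales to millions of elements, which is where the GPU's thousands of cores pay off.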

= 0612 TV =
0612 TV is your one stop for general geekery! Learn about a variety of technology-related subjects, including Photography, General Computing, Audio/Video Production and Image Manipulation! Enjoy your stay, and don't hesitate to drop a comment or send a personal message to my inbox =) If you like my work, don't forget to subscribe!

-----

Disclaimer: Please note that any information is provided on this channel in good faith, but I cannot guarantee 100% accuracy / correctness on all content. Contributors to this channel are not to be held responsible for any possible outcomes from your use of the information.
Comments

Great video. I'm studying CPU architecture at school now and it's interesting to see how it could work together with the GPU.

SyntekkTeam

The vertex shader reads vertices in Model Space (in the case where they come directly from the object file) and transforms them into Clip Space (not Screen Space).

webgpu
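
For readers who want to see what that correction means in practice, here is a sketch of the Model Space to Clip Space transform, written as a CUDA kernel rather than an actual GLSL vertex shader (the names and the identity test matrix are illustrative): the combined model-view-projection matrix maps each model-space position to clip space, and the later pipeline stages handle the rest.

#include <cstdio>
#include <cuda_runtime.h>

struct Vec4 { float x, y, z, w; };

// Row-major 4x4 matrix times a vec4. This is the heart of a vertex shader:
// clip = Projection * View * Model * modelPosition.
__device__ Vec4 mulMV(const float *m, Vec4 v) {
    return { m[0]*v.x  + m[1]*v.y  + m[2]*v.z  + m[3]*v.w,
             m[4]*v.x  + m[5]*v.y  + m[6]*v.z  + m[7]*v.w,
             m[8]*v.x  + m[9]*v.y  + m[10]*v.z + m[11]*v.w,
             m[12]*v.x + m[13]*v.y + m[14]*v.z + m[15]*v.w };
}

// One thread per vertex: model-space position in, clip-space position out.
// The perspective divide (clip -> NDC) and viewport mapping (NDC -> screen)
// happen later in the pipeline, which is why the vertex shader's output is
// clip space, not screen space.
__global__ void transformVertices(const float *mvp, const Vec4 *in,
                                  Vec4 *out, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) out[i] = mulMV(mvp, in[i]);
}

int main() {
    // Identity MVP keeps the demo trivial; a real renderer uploads the
    // combined model-view-projection matrix each frame.
    float mvp[16] = {1,0,0,0, 0,1,0,0, 0,0,1,0, 0,0,0,1};
    Vec4 tri[3] = {{0,0,0,1}, {1,0,0,1}, {0,1,0,1}};
    float *dM; Vec4 *dIn, *dOut;
    cudaMalloc(&dM, sizeof(mvp));
    cudaMalloc(&dIn, sizeof(tri));
    cudaMalloc(&dOut, sizeof(tri));
    cudaMemcpy(dM, mvp, sizeof(mvp), cudaMemcpyHostToDevice);
    cudaMemcpy(dIn, tri, sizeof(tri), cudaMemcpyHostToDevice);
    transformVertices<<<1, 32>>>(dM, dIn, dOut, 3);
    Vec4 res[3];
    cudaMemcpy(res, dOut, sizeof(res), cudaMemcpyDeviceToHost);
    printf("clip[1] = (%f, %f, %f, %f)\n", res[1].x, res[1].y, res[1].z, res[1].w);
    cudaFree(dM); cudaFree(dIn); cudaFree(dOut);
    return 0;
}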

I'm still confused about the parallelism in GPUs. I read a journal article saying that spheres and cylinders have high computational parallelism. How do you know if a figure is parallel enough for the GPU? Please help. Nice video by the way; you've earned a new subscriber.

monkeith
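
"Parallel enough" usually just means data-parallel: each output can be computed independently of all the others. Spheres and cylinders qualify because every surface point follows from its own (u, v) parameters alone, with no dependence on neighboring points. A hedged CUDA sketch of that idea (the sample counts and memory layout are arbitrary choices for illustration):

#include <cstdio>
#include <cmath>
#include <cuda_runtime.h>

// Each thread computes one surface point of a unit sphere from its own
// (u, v) parameters. No thread depends on any other thread's result,
// which is exactly what makes the shape "parallel enough" for a GPU.
__global__ void spherePoints(float3 *out, int nu, int nv) {
    int u = blockIdx.x * blockDim.x + threadIdx.x;
    int v = blockIdx.y * blockDim.y + threadIdx.y;
    if (u >= nu || v >= nv) return;
    float theta = 2.0f * 3.14159265f * u / nu;   // longitude
    float phi   = 3.14159265f * v / (nv - 1);    // latitude
    out[v * nu + u] = make_float3(sinf(phi) * cosf(theta),
                                  sinf(phi) * sinf(theta),
                                  cosf(phi));
}

int main() {
    const int nu = 64, nv = 32;
    float3 *d_pts;
    cudaMalloc(&d_pts, nu * nv * sizeof(float3));
    dim3 block(16, 16), grid((nu + 15) / 16, (nv + 15) / 16);
    spherePoints<<<grid, block>>>(d_pts, nu, nv);
    cudaDeviceSynchronize();
    float3 p;
    cudaMemcpy(&p, d_pts, sizeof(float3), cudaMemcpyDeviceToHost);
    printf("first point: (%f, %f, %f)\n", p.x, p.y, p.z);  // pole: (0, 0, 1)
    cudaFree(d_pts);
    return 0;
}

By contrast, a computation where each step needs the previous step's result (a long dependency chain) gains little from the GPU, no matter what shape it describes.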

Thank you so much for this video! It has been a really helpful starting point for me on this subject.

Can you provide a link to the source where you got the information about the traditional GPGPU approach you mentioned? The one involving converting a problem into an imaging problem.

AviVajpeyi
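
While waiting for a source, here is a sketch of the idea being asked about: before compute APIs existed, GPGPU work packed arbitrary data into textures and ran a fragment shader over a full-screen quad, so that "rendering" each pixel computed one output element. The code below expresses that pattern in CUDA for consistency with the other sketches here (the original technique used graphics shaders, and all names are illustrative):

#include <cstdio>
#include <cuda_runtime.h>

// The old GPGPU trick: pretend your data is an image. Two W x H "textures"
// hold arbitrary numbers, and the "render pass" computes one output pixel
// per thread -- here just an element-wise sum, as a fragment shader would.
__global__ void fragmentPass(const float *texA, const float *texB,
                             float *target, int w, int h) {
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= w || y >= h) return;
    int idx = y * w + x;                  // "pixel" = one data element
    target[idx] = texA[idx] + texB[idx];  // the per-pixel "shading" work
}

int main() {
    const int w = 256, h = 256;
    size_t bytes = w * h * sizeof(float);
    float *texA, *texB, *target;
    cudaMallocManaged(&texA, bytes);
    cudaMallocManaged(&texB, bytes);
    cudaMallocManaged(&target, bytes);
    for (int i = 0; i < w * h; i++) { texA[i] = i * 0.5f; texB[i] = i * 0.5f; }
    dim3 block(16, 16), grid(w / 16, h / 16);
    fragmentPass<<<grid, block>>>(texA, texB, target, w, h);
    cudaDeviceSynchronize();
    printf("target[100] = %f (expect 100.0)\n", target[100]);
    cudaFree(texA); cudaFree(texB); cudaFree(target);
    return 0;
}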

Fully parallelized GPU systems have already been demoed in the U.S. and Europe: real-time big-data database systems handled by just a few added GPU cards.

billyheng

So you could write a game that used extra GPUs to handle things other than graphics without worrying about needing Crossfire or SLI? Have one GPU doing the actual drawing, another calculating the behavior of a bunch of NPCs, and the CPU handling the main mechanics of the game.

johnterpack
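
In principle, yes: CUDA and similar compute APIs address each GPU directly, so no SLI/CrossFire link is required for compute work. A hypothetical sketch of that split (the NPC struct and its update rule are invented for illustration; a real game would interleave this with its frame loop):

#include <cstdio>
#include <cuda_runtime.h>

// Hypothetical NPC state; the fields and update rule are made up for the sketch.
struct NPC { float x, y, vx, vy; };

// One thread per NPC: integrate position. Nothing graphical here -- the GPU
// is being used purely as a wide parallel processor for game logic.
__global__ void updateNPCs(NPC *npcs, int n, float dt) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) {
        npcs[i].x += npcs[i].vx * dt;
        npcs[i].y += npcs[i].vy * dt;
    }
}

int main() {
    int deviceCount = 0;
    cudaGetDeviceCount(&deviceCount);
    // If a second GPU exists, run the game logic there, leaving device 0
    // free to do the actual drawing.
    int logicDevice = (deviceCount > 1) ? 1 : 0;
    cudaSetDevice(logicDevice);

    const int n = 10000;
    NPC *d_npcs;
    cudaMalloc(&d_npcs, n * sizeof(NPC));
    cudaMemset(d_npcs, 0, n * sizeof(NPC));
    updateNPCs<<<(n + 255) / 256, 256>>>(d_npcs, n, 0.016f);  // one 60 fps tick
    cudaDeviceSynchronize();
    printf("updated %d NPCs on device %d\n", n, logicDevice);
    cudaFree(d_npcs);
    return 0;
}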

GPU data processing inspires an idea of mine: why not a hybrid CPU & GPU, where both processors are general-purpose? Such an idea might be realistic, though it is only applicable in the latest supercomputers NVIDIA has launched. Development is expensive, unless someone runs a development center to explore real-time big data. In the U.S. there is a demo that begins with flat data transferred in parallel. In the future it might be possible to encode datasets in image form, but that still requires a lot of hand-coding and different techniques to achieve massively parallel dynamic streaming of 3D datasets, not to forget low-level inter-processor communication. That would be very similar to brain computing, which is getting more interesting; by then data transmission will happen in a flash of milliseconds.

billyheng

Omg wow, Dr Low? Haha, I was a TA for him for one semester and he did tell me he taught computer graphics, but wow, he is famous!

ExDarkx

I see all the effort you put into making your videos. Have you thought about posting them on websites such as Stack Exchange in order to increase the number of views?
You could simply search the site for people asking questions related to your videos and post a link as an answer (even if the question has already been answered).

Best regards,
Daniyal

dshahrokhian