Wow, NVIDIA’s Rendering, But 10X Faster!


📝 The paper "3D Gaussian Splatting for Real-Time Radiance Field Rendering" is available here:

Community showcase links:

My latest paper on simulations that look almost like reality is available for free here:

Or this is the orig. Nature Physics link with clickable citations:

🙏 We would like to thank our generous Patreon supporters who make Two Minute Papers possible:
Aleksandr Mashrabov, Alex Balfanz, Alex Haro, Andrew Melnychuk, Benji Rabhan, Bret Brizzee, Bryan Learn, B Shang, Christian Ahlin, Geronimo Moralez, Gordon Child, Jace O'Brien, Jack Lukic, John Le, Kenneth Davis, Klaus Busse, Kyle Davis, Lukas Biewald, Martin, Matthew Valle, Michael Albrecht, Michael Tedder, Nikhil Velpanur, Owen Campbell-Moore, Owen Skarpness, Rajarshi Nigam, Ramsey Elbasheer, Richard Sundvall, Steef, Taras Bobrovytsky, Ted Johnson, Thomas Krcmar, Timothy Sum Hon Mun, Torsten Reil, Tybie Fitzhugh, Ueli Gallizzi.

Károly Zsolnai-Fehér's links:
Comments

This paper is the actual GOAT, not just the goat

zyxyuv

I really love when the most amazing papers don't use neural networks

cocccix

This is why explainability in AI is so important. Let AI do the tedious exploration of the solution space, have it explain what it's doing, and then we can work with what it found and modify the solution however we want.

darrylkid

Can't believe it has no neural components! Just proof that we humans still have the advantage, but one day that may change 😅

ENDESGA

Would love to experience this level of detail in VR.

Something

I love your use of the phrase "just one more paper down the line". So true.

mikehibbett

This paper is just lit. Imagine it being used in future games. Crazy times ahead, everyone.

hemant

This is even more amazing than the average mind-blowing breakthroughs on this channel.

Wobbotherd

Those thin structures were impossible to even scan a few years ago. What a time to be alive and be able to witness the advances that we achieve just by pushing further.

MrtnX

🎯 Key Takeaways for quick navigation:

00:53 🏞️ New technique for real-time rendering of virtual worlds promises over 10x faster rendering than previous methods, addressing thin-structure challenges.
01:52 🎮 The new technique offers both faster rendering and higher-quality results than NVIDIA's Instant NeRF technique, a surprising combination.
02:49 🖥️ This breakthrough method is not a NeRF variant and doesn't rely on neural networks; it's a handcrafted computer graphics technique with innovative ideas.
03:14 🌊 The technique uses a 3D Gaussian splatting approach to represent the scene as a collection of 3D Gaussians, enabling efficient rendering on 2D screens while concentrating computation around solid objects.
04:40 🎨 The algorithm focuses on scene primitives rather than pixels, drawing on a long-standing concept in computer graphics and resulting in a fast yet high-quality rendering solution.
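
The "primitives rather than pixels" point can be made concrete with a small sketch. The Python/NumPy snippet below is only an illustration of the general splatting idea under simplifying assumptions (a pinhole camera, a brute-force loop over every pixel for every splat, no spherical-harmonics colors, and none of the paper's tile-based GPU rasterization); all names in it are illustrative, not the paper's implementation. Each 3D Gaussian is projected to a 2D Gaussian on the screen, and the splats are alpha-composited front to back.

```python
import numpy as np

def project_gaussian(mean, cov, focal, img_size):
    """Project a 3D Gaussian (mean, covariance) to a 2D screen-space Gaussian
    using the local affine (Jacobian) approximation of the pinhole projection."""
    x, y, z = mean
    u = focal * x / z + img_size / 2                     # screen-space center
    v = focal * y / z + img_size / 2
    J = np.array([[focal / z, 0.0, -focal * x / z**2],   # Jacobian of the projection
                  [0.0, focal / z, -focal * y / z**2]])
    cov2d = J @ cov @ J.T                                # 2x2 screen-space covariance
    return np.array([u, v]), cov2d

def render(splats, focal=500.0, img_size=256):
    """Rasterize Gaussian splats with front-to-back alpha compositing."""
    img = np.zeros((img_size, img_size, 3))
    transmittance = np.ones((img_size, img_size))
    # Sort primitives by depth so nearer splats are composited first.
    for mean, cov, color, opacity in sorted(splats, key=lambda s: s[0][2]):
        mu2d, cov2d = project_gaussian(mean, cov, focal, img_size)
        inv = np.linalg.inv(cov2d)
        ys, xs = np.mgrid[0:img_size, 0:img_size]
        d = np.stack([xs - mu2d[0], ys - mu2d[1]], axis=-1)
        # Gaussian falloff in screen space gives a per-pixel alpha for this splat.
        power = -0.5 * np.einsum('...i,ij,...j->...', d, inv, d)
        alpha = opacity * np.exp(power)
        img += (transmittance * alpha)[..., None] * color
        transmittance *= 1.0 - alpha
    return img

# Tiny hypothetical scene: two blobs at different depths.
splats = [
    (np.array([0.0, 0.0, 4.0]), 0.05 * np.eye(3), np.array([1.0, 0.2, 0.2]), 0.8),
    (np.array([0.3, 0.1, 6.0]), 0.10 * np.eye(3), np.array([0.2, 0.4, 1.0]), 0.9),
]
image = render(splats)
```

The actual method also fits the Gaussians' positions, covariances, colors, and opacities to input photos with gradient descent and adaptively densifies them, which this sketch leaves out.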

titusfx

It’s pretty impressive that we get better speed without sacrificing quality. Amazing work!

bashergamer_

"quite a bit of memory" is a nice and understandable value

sliter

This is actually really impressive. I'm also glad to see how much there still is to be gained from non-AI research; I feel like most recent papers have been based largely on machine learning.

Metazolid

Sublime! The amount of progress that's been made in just a year or two feels _unreal._ Looks like the Singularity is right on-schedule!

GL-GildedLining

Love the endless advancement; hope this new technique can output 3D mesh data.

adamfilipowicz

5:10 Couldn't that be fixed by combining it with ray tracing, as the next step of rendering a scene?

It's incredible. I never thought things would advance so quickly; I'm starting to believe we might even see actual AR and VR games before the end of the decade.

JealusJelly

I was impressed a year ago, and now... wow.

GoldBearanimationsYT

Does anybody else think that the Professor used an AI voice clone to do the sponsor segment? To me it seemed less expressive, and I think there were some audible artifacts.

justluke

This is groundbreaking! This is what I subbed to this channel for.

shApYT

No longer "It's NeRF or nothing" 😮‍💨

JorgetePanete