Noise-Based RNG

In this 2017 GDC Math for Game Programmers talk, SMU Guildhall's Squirrel Eiserloh discusses RNGs vs. noise functions and shows how the latter can replace the former in your math library while being smaller, faster, and easier to use, providing many other benefits along the way (unordered access, better reseeding, record/playback, network loss tolerance, lock-free parallelization, etc.).

GDC talks cover a range of game development topics including game design, programming, audio, visual arts, business management, production, online games, and much more. We post a fresh GDC video every day. Subscribe to the channel to stay on top of regular updates, and check out GDC Vault for thousands more in-depth talks from our archives.
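The core idea of the talk can be sketched as a stateless integer noise function: instead of advancing hidden RNG state, you hash a position (plus a seed) into a random-looking value. A minimal Python sketch; the mixing constants are as commonly transcribed from the talk's Squirrel3 slide, but treat them as illustrative bit-rich values rather than a canonical reference:

```python
def noise1d(position: int, seed: int = 0) -> int:
    """Stateless 32-bit integer noise: same (position, seed) -> same output."""
    # Illustrative bit-mixing constants (large values with varied bits).
    BIT_NOISE1 = 0xB5297A4D
    BIT_NOISE2 = 0x68E31DA4
    BIT_NOISE3 = 0x1B56C4E9

    m = (position * BIT_NOISE1) & 0xFFFFFFFF
    m = (m + seed) & 0xFFFFFFFF
    m ^= m >> 8
    m = (m + BIT_NOISE2) & 0xFFFFFFFF
    m ^= (m << 8) & 0xFFFFFFFF
    m = (m * BIT_NOISE3) & 0xFFFFFFFF
    m ^= m >> 8
    return m

# Unordered access: index 1000 can be queried without computing 0..999,
# and replaying any index always gives the same value.
assert noise1d(1000) == noise1d(1000)
```

Every step after the position multiply is invertible on 32 bits (xor-shifts, adding a constant, multiplying by an odd constant), so distinct pre-mix values always map to distinct outputs; the practical win is that any index can be queried out of order and replayed deterministically.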
Comments

These are the videos I'm here for.

dotaportalvideo

Some super useful information here. Two pain points though: Perlin isn't a good default to keep promoting for noise, due to squareness, and Simplex/Perlin != fractal noise.

Squareness, or more generally visual anisotropy, runs principally counter to the nature-emulating goals of noise. Noise should look, to a reasonable degree, probabilistically the same in all directions. Simplex(-type) noise is better about that and, while mentioned in the talk, it was given an unfair backseat position.

Fractal noise ("fractal Brownian motion" / fBm) is the process of adding together multiple layers of some base noise to produce the so-termed "crunchy" result. It's incredibly useful, but also worthwhile to understand as separate from any individual noise function. That way we can better convey the versatility of noise, and avoid invoking the name of a specific algorithm when we may not need to. Hash (integer "Noise") -> Smooth Noise (e.g. Simplex) -> Fractal Noise (+ many other formula options!), each a building block of the next.

Once you move past this, this talk does a great job at covering important fundamentals.

kjpg
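The hash → smooth → fractal pipeline described in the comment above can be sketched in Python. Here `smooth_noise` is simple 1D value noise standing in for any base noise (value, Perlin, simplex), and the hash constants are illustrative:

```python
import math

def hash_noise(i: int, seed: int = 0) -> float:
    """Integer hash mapped to [0, 1) -- the bottom building block."""
    h = (i * 0xB5297A4D + seed) & 0xFFFFFFFF
    h ^= h >> 8
    h = (h * 0x1B56C4E9) & 0xFFFFFFFF
    h ^= h >> 8
    return h / 2**32

def smooth_noise(x: float, seed: int = 0) -> float:
    """Value noise: hash at integer lattice points, smoothstep between them."""
    i = math.floor(x)
    f = x - i
    t = f * f * (3.0 - 2.0 * f)  # smoothstep interpolant
    a, b = hash_noise(i, seed), hash_noise(i + 1, seed)
    return a + t * (b - a)

def fractal_noise(x: float, octaves: int = 4, seed: int = 0) -> float:
    """fBm: sum octaves of base noise at doubling frequency, halving amplitude."""
    total, amplitude, frequency, norm = 0.0, 1.0, 1.0, 0.0
    for octave in range(octaves):
        # Offset the seed per octave so layers are uncorrelated.
        total += amplitude * smooth_noise(x * frequency, seed + octave)
        norm += amplitude
        amplitude *= 0.5
        frequency *= 2.0
    return total / norm  # normalize back to [0, 1)
```

The point of keeping the three stages separate is that any layer can be swapped independently: replace `smooth_noise` with simplex, or change the octave/persistence formula, without touching the underlying hash.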

Squirrel Eiserloh was my game programming professor at SMU Guildhall. He was the best teacher I ever had. The most amazing person, who explained complex concepts in such a cool, interesting and easy-to-understand manner. Everyone has their No. 1 teacher that they remember fondly for the rest of their lives. For me, that teacher is Squirrel Eiserloh.

KKomalShashank

Squirrel3 is at 44:34, and Squirrel3 with seed is at 51:52.

ciberman

Thanks for the presentation. 😀

00:00:00 Intro
00:02:00 Who is Squirrel Eiserloh?
00:02:30 What will we be talking about?
00:06:00 Talk Overview
00:06:30 What do we want in an RNG?
00:11:56 What RNGs should we consider?
00:19:20 Limitations of traditional RNGs
00:26:45 Noise functions
00:36:00 RNG-based Noise
00:46:30 Seeding Noise functions
00:47:35 Multidimensional Noise functions
00:50:18 Noise and RNG Takeaways

juanchis.investigadorsonoro

Perfect timing~
That was a very impressive and detailed talk.
I find it quite fascinating how RNG can be influenced, even predicted, and deliberately manipulated.

yivo

This was a huge help to me as a game/engine programmer and answered a lot of questions I had about this subject, without my having to spend long months going down the rabbit hole of researching and implementing these things and then exhaustively unit testing them.

GameDevNerd

This makes the Factorio world generator make A LOT more sense (among other things). Thanks so much for this.

Kindred

I don't think this should be called noise, since that term usually refers to "organized noise" algorithms such as simplex, Voronoi, etc. His algorithm is more like a stateless RNG to me, since you can use his stateless RNG to generate simplex noise and the like. His argument, to me, boils down to: stateless RNG is better than stateful RNG.

bonehelm

I'm shaking, this is crazy good, thanks Squirrel!

VladyVeselinov

I wanted to transition from generating all data and saving/loading it from a database to using procedural generation. I had to work out the ideal RNG, so I created a tool that generates many iterations of RNGs with different parameters. They were a mix of bit-shift, XOR, XAND and arithmetic operations (+/-/*) in random and varying amounts, plus varying numbers of bit-shift masks generated from its own RNG, and the tool evaluated them at the end. So I created samples, each containing 100 RNGs with varying seeds (0 to 999, randomly picked, in steps of ~1000, etc.). But I had a special criterion besides the usual ones: the FIRST result of each seed had to be measured too.

I created a normal version of it, a light version, and a BitScrambler, whose purpose was to take in coordinates (x, y, z) and produce differing numbers. It is like an RNG, but one that always starts from its zero state (it only takes the input into account). Imagine that the inputs (0, 0, 0), (0, 0, 1), (0, 1, 0) and (1, 0, 0) all had to produce different numbers (for seeds), and that this might scale up to thousands or millions of coordinates. At the end I made a tiny tweak and it improved the output by at least a factor of a million. Before that, some generated seeds were still increments of each other, and any two identical starting seeds meant one coordinate was generating the SAME content as another.

In my early tests with my procedural galaxy generation I got very weird patterns, sometimes even grid-like structures. Now that I am finished with all this, my galaxy looks exactly the way I want. By the way, storing 10,000,000 solar systems with all planet and moon data required 5 GB. Now I can have not only galaxies of basically any size, but also any number of galaxies. My coordinate system is precise down to meters or better and can map the entire observable universe.

brianviktor
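The coordinate-to-seed problem in the comment above is what the talk's multidimensional section (47:35) addresses: fold x, y, z into a single index using large odd mixing constants before hashing, so that neighboring coordinates like (0, 0, 0), (0, 0, 1), (0, 1, 0) and (1, 0, 0) produce unrelated values. A hedged Python sketch; all constants here are illustrative, not a canonical implementation:

```python
# Illustrative large odd mixing constants for the extra dimensions;
# any large odd values with varied bits behave similarly.
MIX_Y = 198491317
MIX_Z = 6542989

def noise3d(x: int, y: int, z: int, seed: int = 0) -> int:
    """Fold 3D coordinates into one index, then apply a 1D integer hash."""
    index = (x + MIX_Y * y + MIX_Z * z) & 0xFFFFFFFF
    h = (index * 0xB5297A4D + seed) & 0xFFFFFFFF
    h ^= h >> 8
    h = (h + 0x68E31DA4) & 0xFFFFFFFF
    h ^= (h << 8) & 0xFFFFFFFF
    h = (h * 0x1B56C4E9) & 0xFFFFFFFF
    h ^= h >> 8
    return h

# Neighboring coordinates map to distinct indices, so the hash
# output for each is unrelated -- no incrementing-seed artifacts.
corners = [noise3d(0, 0, 0), noise3d(0, 0, 1), noise3d(0, 1, 0), noise3d(1, 0, 0)]
```

Because the mix step puts distinct nearby coordinates at distinct indices and the hash pipeline is invertible on 32 bits, collisions between neighbors are impossible by construction, which is exactly the "increments to each other" failure mode the commenter was fighting.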

Just watched this for the first time. A notable takeaway from this talk for me is that the random number generation aspect should be separated from the visual, organic aspect ("smooth noise"). We could, and maybe should, be comparing the quality and efficiency of noise-smoothing algorithms in the same way that Squirrel compares the grade of the randomness of the numbers. And the two aspects should be separate, or at least clearly apparent and changeable in the source of those smooth-noise generators.

jameshood

It would be awesome if some math or CS savant would put up a table of large prime numbers with non-boring bits, preferably labeled as "good for 16/32/64 bits."

ColinPaddock

46:33 is the algorithm, also at 51:59

daveyhu

This is very interesting, but I do want to point something out: the `std::hash` function is implementation-defined, and identity is completely compatible with the spec. I don't know if any implementations actually use it.

danielrhouck

So… what are the licensing rules for using the squirrel hashes?

ColinPaddock

44:32 He states "these are prime numbers", but none of them are. So what is the reason for this particular selection? Since they're part of a larger set of multiplication and bit-shifting operations, is the purpose just to have "interesting bits"? If so, he just misspoke/misremembered, but then, what counts as "interesting bits"?
And if not, then there's a pretty big issue with this code, because none of these numbers are actually prime.

StianF

52:00 Not sure what the mistake is, but a prime should clearly end with a 1 in binary and BIT_NOISE1 does not, so it's not actually a prime number, even though the talk says it is supposed to be.

juyas

Good presentation, with well-thought-out arguments for using noise.

skyblazeeterno

If you parameterize the noise function by a seed, does that statistics suite account for different seeds? For example a noise function may work great at seed 0 but be highly correlated and fail tests at seed 255.

vkessel
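On the question above: a per-seed check is easy to prototype. The sketch below runs a crude bucket-frequency (chi-square style) statistic across many seeds of a hypothetical `noise(i, seed)` hash; it is nowhere near a real suite like PractRand or TestU01, but it will flag a seed whose output is grossly non-uniform:

```python
def noise(i: int, seed: int) -> int:
    # Hypothetical 32-bit integer noise function under test
    # (illustrative constants, not a canonical implementation).
    h = (i * 0xB5297A4D + seed) & 0xFFFFFFFF
    h ^= h >> 8
    h = (h * 0x1B56C4E9) & 0xFFFFFFFF
    h ^= h >> 8
    return h

def chi_square(seed: int, samples: int = 4096, buckets: int = 16) -> float:
    """Bucket-frequency statistic; for uniform output it hovers near buckets - 1."""
    counts = [0] * buckets
    for i in range(samples):
        counts[noise(i, seed) % buckets] += 1
    expected = samples / buckets
    return sum((c - expected) ** 2 / expected for c in counts)

# The commenter's point: run the statistic for many seeds, not just
# seed 0, and look for outlier seeds.
scores = [chi_square(seed) for seed in range(32)]
```

A proper answer would feed each seed's output stream through the full statistical suite, exactly as the question suggests; this sketch just shows that sweeping seeds is cheap enough to make that feasible.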