When aiming for a fully deterministic program, it is common knowledge that you have to deterministically seed your random number generators, and be careful about multithreading and floating point computations.
In this post I want to highlight a few less commonly mentioned pitfalls I encountered when making my game deterministic.
Foreshadowing
- different compilers and compiler options.
- different standard library implementations (STLs).
- different architectures.
Random numbers
#include <random>
int get_random_number() {
int64_t seed = 9876543210;
std::mt19937 rng(seed); // a generator based on mersenne twister
std::uniform_int_distribution<int> dist(0, 10); // to get integers in [0, 10]
return dist(rng);
}
1. uint_fast32_t
The issue? The std::mt19937 implementation uses uint_fast32_t, which becomes uint32_t or uint64_t, depending on the architecture:
typedef mersenne_twister_engine<uint_fast32_t, 32, 624, 397, 31,
0x9908b0df, 11, 0xffffffff,
7, 0x9d2c5680,
15, 0xefc60000,
18, 1812433253> mt19937;
The fix was replacing the uint_fast32_t with a uint32_t.
In general, if you are targeting 32 bit platforms, beware of uint_fast32_t!
2. Standard Library Distributions
You trace it back to the STL's std::uniform_int_distribution. This makes sense! There are multiple valid ways of turning a random number generator's output into a uniform distribution.
To be precise, from what I've seen the STLs all use the same algorithm, but the exact implementations can and do vary.
In my case I discovered the problem when porting the game to Linux. I solved the problem by using my own distribution implementation.
Sorting
Order of evaluation of parameters
Say you want to generate a random coordinate in a 10x10 square and you write the following code:
You may get {2,8} on Linux, but {8,2} on Windows!
Pointers
Enforcing limits on memory usage
In my game I've restricted the memory consumption of the Lua interpreter: the interpreter returns an error when a user-provided script uses too much memory.
I would like this error to occur deterministically.
Unfortunately that is not possible because the memory usage of the Lua interpreter is not deterministic:
A given structure will not be the same size on every platform (e.g. because the pointer size is different, because the alignment requirements are different, because the STL is different, because of a compiler flag, etc...).
The lesson here is that if you want to deterministically enforce limits on memory consumption, you need to work at a higher level of abstraction than raw allocation sizes.
Deterministically limiting memory usage is a niche requirement though and I am not aware of any library that does that, let alone any interpreter.
More blog posts on determinism
- Gafferongames [2010]
- Explains why deterministic floating point calculations are non-trivial.
- Floating point determinism (Bruce Dawson) [2013]
- Goes into a lot of detail about why deterministic floating point can be hard, including how having more precision than needed is a problem.
- Riot Games (League of Legends) [2017]
- Covers making an existing game deterministic.
- Touches upon uninitialized variables and non-deterministic pointer values.
- Box2D [2024]
- Explains under which conditions its author has been able to get deterministic floating point calculations.
- Explains that Box2D does not support rollback. Fortunately we've seen a technique to roll back non-rollbackable libraries.