Higher Dimensional Morton Encodings #34

Open
Microno95 opened this issue Jan 10, 2019 · 1 comment
Comments

@Microno95

(Thought it would be more appropriate to start a new thread to keep the topics separate.)

I noticed that you were seeking a possible alternative to Morton encoding that is applicable to higher dimensions.

I remember reading up on these when I wrote a Barnes-Hut code myself, and it seemed that the biggest issue with Morton encoding is that, in higher dimensions, fewer significant bits per coordinate are incorporated into the encoding.

Could this be solved by using a Hilbert curve-based encoding or is this a mathematical issue that just requires more bits for the encoded value?

If it's the latter, could packed types such as SIMD vectors or std::bitset be used to effectively increase the number of bits available for the encoding?

@bluescarni
Owner

It is a combinatorial issue. For a fixed per-coordinate resolution, the number of bits you need to create a Morton code increases linearly with the dimensionality of the space; equivalently, a code of fixed width has fewer bits available per coordinate as the dimensionality grows. The Hilbert curve behaves the same way in this regard.
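
As a rough illustration (this is not code from rakau), a fixed-width code has to split its bits evenly across the coordinates, so the per-coordinate resolution drops as the dimensionality grows:

```cpp
#include <cstdio>

// Bits of per-coordinate resolution left in a W-bit Morton (or Hilbert) code
// once it has to cover an N-dimensional space: each coordinate only gets
// floor(W / N) bits, which is the combinatorial limitation mentioned above.
constexpr unsigned bits_per_coordinate(unsigned total_bits, unsigned ndim)
{
    return total_bits / ndim;
}

int main()
{
    for (unsigned ndim = 2; ndim <= 6; ++ndim) {
        std::printf("%u dimensions, 64-bit code: %2u bits per coordinate\n",
                    ndim, bits_per_coordinate(64, ndim));
    }
}
```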

The advantage of the Hilbert curve is that it does not "jump" around, whereas the Z-shaped curve of the Morton codes will sometimes jump in a discontinuous fashion. In other words, N successive Hilbert codes will generally span a region in space smaller than N successive Morton codes, so in this sense the Hilbert encoding does a better job at preserving locality of space.
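
A quick way to see the jumps (a minimal 2D sketch, not rakau's encoder) is to decode consecutive Morton codes and look at the cells they visit; for example, code 7 maps to (3, 1) while the next code, 8, maps to (0, 2):

```cpp
#include <cstdint>
#include <cstdio>

// De-interleave a 2D Morton code: x lives in the even bits, y in the odd bits.
std::uint32_t component(std::uint32_t code, unsigned offset)
{
    std::uint32_t v = 0;
    for (unsigned i = 0; i < 16; ++i) {
        v |= ((code >> (2 * i + offset)) & 1u) << i;
    }
    return v;
}

int main()
{
    // Walk the first 16 codes of the Z-curve; note the discontinuous step
    // from code 7 at (3, 1) to code 8 at (0, 2).
    for (std::uint32_t code = 0; code < 16; ++code) {
        std::printf("code %2u -> (%u, %u)\n", code,
                    component(code, 0), component(code, 1));
    }
}
```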

The disadvantage of the Hilbert codes is that they are much more expensive to compute. I actually had a branch in rakau at one point that was using Hilbert codes:

https://github.com/bluescarni/rakau/tree/pr/ph

So, for the purpose of N-body simulations and BH trees, a Hilbert curve can potentially make the tree traversal faster, but the (re)creation of the tree will be more expensive. It's not clear to me yet how these two aspects balance out in typical use cases; for the time being I decided to keep the Morton codes as I find them generally easier to work with, but in the future I have not ruled out switching to Hilbert codes. Perhaps this could even be made a configurable parameter of the tree class.

In 3D, with 64-bit Morton codes, you have a spatial resolution in each dimension of 2**-21 of the domain edge. This means that, in a cubic domain with an edge of 1 AU, the smallest possible tree node has an edge of circa 71 km, if my math is not off. This resolution has been enough for me so far, but if in the future it turns out it isn't, we could switch to 128-bit Morton codes rather easily (on the CPU, at least). This would give a minimum tree node size of about 3.4 cm.
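
For reference, a quick check of those figures (taking 1 AU ≈ 1.496e11 m; the values are approximate):

```cpp
#include <cmath>
#include <cstdio>

int main()
{
    const double au_m = 1.495978707e11; // 1 AU in metres
    // 64-bit code in 3D -> 21 bits per coordinate -> smallest edge of 1 AU / 2^21.
    const double node_64 = au_m / std::pow(2.0, 21);
    // 128-bit code in 3D -> 42 bits per coordinate -> smallest edge of 1 AU / 2^42.
    const double node_128 = au_m / std::pow(2.0, 42);
    std::printf("64-bit code:  smallest node edge ~ %.1f km\n", node_64 / 1e3);
    std::printf("128-bit code: smallest node edge ~ %.1f cm\n", node_128 * 1e2);
}
```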

(Note that rakau will work just fine when reaching the smallest tree node. In such a case, the setting for the minimum number of bodies in leaf nodes will just be ignored, and the only consequence could be a degradation of performance.)

Finally, there is actually an N-dimensional Morton library available on GitHub here:

https://github.com/kevinhartman/morton-nd

The problem I had with it is that it noticeably increased compile times, so I decided to stick with the current solution based on libmorton at this time. If we really need N-dimensional trees, we can always switch over.
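
For what it's worth, the core idea generalises with a straightforward bit-interleaving loop; this is only a minimal sketch of N-dimensional Morton encoding, not morton-nd's actual interface:

```cpp
#include <array>
#include <cstdint>
#include <cstdio>

// Minimal N-dimensional Morton encoding by plain bit interleaving. This is a
// sketch of the idea only; production libraries (such as libmorton or
// morton-nd) typically rely on lookup tables or bit tricks instead of a
// per-bit loop.
template <std::size_t NDim>
std::uint64_t morton_encode(const std::array<std::uint64_t, NDim> &coords)
{
    constexpr unsigned bits_per_dim = 64 / NDim;
    std::uint64_t code = 0;
    for (unsigned bit = 0; bit < bits_per_dim; ++bit) {
        for (std::size_t d = 0; d < NDim; ++d) {
            code |= ((coords[d] >> bit) & 1u) << (bit * NDim + d);
        }
    }
    return code;
}

int main()
{
    // 4D example: each coordinate contributes 16 bits to the 64-bit code.
    const std::array<std::uint64_t, 4> p{1, 2, 3, 4};
    std::printf("%llu\n", static_cast<unsigned long long>(morton_encode(p)));
}
```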
