Appending to an existing dataset #592

Answered by 1uc
njchim asked this question in Questions

There's the following example, which I hope will help you or others:
https://github.com/BlueBrain/HighFive/blob/master/src/examples/create_extensible_dataset.cpp

Please let us know if there's something missing from the example.

Chunking gives HDF5 permission to store the array in chunks, e.g. elements 0, ..., 15 and 16, ..., 31 are each stored contiguously as a chunk, but the chunks themselves don't need to be adjacent. (The chunk size is configurable; 32 is likely too small to be a good choice.) Only once the requirement to store the entire array contiguously is lifted can one imagine increasing the size of the array on disk. Conceptually it might be something like: add new chunks to the end of the file and update some internal data structure…
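
For reference, here is a minimal sketch along the lines of the linked example: create a chunked dataset with an unlimited first dimension, then resize it and write into the newly added rows. The file name, dataset name, sizes, and data are purely illustrative.

```cpp
#include <vector>

#include <highfive/H5File.hpp>
#include <highfive/H5DataSet.hpp>
#include <highfive/H5DataSpace.hpp>

int main() {
    using namespace HighFive;

    // Illustrative file name; overwrite any existing file.
    File file("extensible.h5", File::ReadWrite | File::Create | File::Truncate);

    // Start with a 4x2 dataset; allow unbounded growth along the first axis.
    DataSpace dataspace({4, 2}, {DataSpace::UNLIMITED, 2});

    // Chunking is required for resizable datasets. The 2x2 chunk here is only
    // for the demo; real chunk sizes should be much larger.
    DataSetCreateProps props;
    props.add(Chunking(std::vector<hsize_t>{2, 2}));

    DataSet dataset = file.createDataSet<double>("dset", dataspace, props);

    std::vector<std::vector<double>> data{{1., 2.}, {3., 4.}, {5., 6.}, {7., 8.}};
    dataset.write(data);

    // "Append": grow the dataset by two rows, then write into the new region
    // via a hyperslab selection (offset {4, 0}, count {2, 2}).
    dataset.resize({6, 2});
    std::vector<std::vector<double>> extra{{9., 10.}, {11., 12.}};
    dataset.select({4, 0}, {2, 2}).write(extra);

    return 0;
}
```

Only dimensions whose maximum extent (here DataSpace::UNLIMITED) exceeds the initial extent can be resized later.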

Answer selected by ohm314