When a user inserts a new element into a Sparse Merkle Tree and the insertion hits a conflict, the operation becomes two inserts: the old element has to be pushed down a level, and the new element has to be inserted.
This hashes the complete Merkle path twice, once per update. It may make sense to batch operations and remove the redundant work (in the example above, that would be the hashing from the common parent up to the root).
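The idea can be sketched in Python. This is not the project's implementation: the hash function, tree depth, and node-storage layout below are all illustrative assumptions. The point is that a batch update marks every touched leaf dirty first and then hashes level by level, so each shared ancestor (including the common-parent-to-root segment) is hashed exactly once, no matter how many leaf updates feed into it.

```python
import hashlib

DEPTH = 4  # illustrative tree depth; leaves live at level DEPTH

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

# Precomputed hashes of empty subtrees: EMPTY[d] is an empty subtree of height d.
EMPTY = [h(b"empty")]
for _ in range(DEPTH):
    EMPTY.append(h(EMPTY[-1] + EMPTY[-1]))

def batch_update(nodes: dict, leaves: dict) -> bytes:
    """Apply several leaf updates at once, hashing each affected node once.

    `nodes` maps (level, index) -> hash; absent nodes are empty subtrees.
    `leaves` maps leaf index -> new leaf value (bytes). Returns the new root.
    """
    dirty = set(leaves)
    for idx, value in leaves.items():
        nodes[(DEPTH, idx)] = h(value)
    # Propagate upward one level at a time; siblings that share a dirty
    # parent are combined in a single hash per batch, not once per update.
    for level in range(DEPTH, 0, -1):
        parents = {i // 2 for i in dirty}
        for p in parents:
            left = nodes.get((level, 2 * p), EMPTY[DEPTH - level])
            right = nodes.get((level, 2 * p + 1), EMPTY[DEPTH - level])
            nodes[(level - 1, p)] = h(left + right)
        dirty = parents
    return nodes[(0, 0)]
```

Applying the two sub-inserts of the conflict case as one batch yields the same root as performing them sequentially, while the shared part of the path is only hashed once.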
I wonder if it would make sense to expose batch operations at the VM level too. The example in the issue is a single insert from the user's perspective that becomes two inserts in the tree implementation.
However, this could be generalized further. For instance, a user may want to construct a tree in the VM; currently this has to be done in an assembly loop, with the user gradually modifying the tree by adding new elements. This would be more efficient for the Sparse Merkle Tree if done in batch, i.e. with all the elements inserted at the same time. The top layers of the tree are shared and will very likely be overwritten on each insert, so if the elements are added in batch, we can write the code to hash only what is needed, once.
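The size of the saving can be estimated without any actual hashing. The small Python model below (the depth and leaf range are illustrative assumptions, not project constants) counts the internal nodes touched when leaves are inserted one at a time, versus the distinct ancestors touched when the same leaves are inserted as one batch: the shared top layers are counted once instead of once per insert.

```python
DEPTH = 8  # illustrative depth; 2**DEPTH leaf slots

def ancestors(leaf: int) -> set:
    """(level, index) of every internal node on the path from a leaf to the root."""
    path, idx = set(), leaf
    for level in range(DEPTH, 0, -1):
        idx //= 2
        path.add((level - 1, idx))
    return path

leaves = range(16)  # 16 adjacent leaves inserted into the tree

# One insert at a time: the full path to the root is re-hashed per insert.
one_by_one = sum(len(ancestors(l)) for l in leaves)  # 16 * 8 = 128 node hashes

# Batched: each distinct ancestor is hashed exactly once.
batched = len(set().union(*(ancestors(l) for l in leaves)))  # 19 node hashes
```

For these 16 adjacent leaves, incremental insertion hashes 128 internal nodes while the batch hashes only the 19 distinct ancestors; the gap grows with tree depth, since the per-insert cost is proportional to the full path length.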
edit: batching would also make sense for the case where leaves from one tree are copied into another (#82 (comment)) and to implement deletes (#85 (comment)).