fix: file chunking due to mix of nodeSize and chunkSize limitation #176
User description

The logic for when to use maxChunkSize and when to use maxNodeSize was set up incorrectly, which led to files not being processed at all in auto-drive.

PR Type

Bug fix, Tests

Description

Fixes the mix-up between maxChunkSize and maxNodeSize by replacing DEFAULT_MAX_CHUNK_SIZE with DEFAULT_NODE_MAX_SIZE where the node size limit is intended, across the codebase.
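For context, the two limits play different roles in an IPLD DAG builder, and the sketch below is illustrative only: the constant names mirror the PR, but the values and the assertNodeFits helper are assumptions, not definitions from auto-dag-data. Roughly, maxChunkSize bounds the raw data placed in a leaf chunk, while maxNodeSize bounds the fully encoded node, links and metadata included; checking the wrong constant can reject every node, which matches the "files not processing at all" symptom.

```ts
// Illustrative only: constant names mirror the PR; the values and this helper
// are assumptions, not definitions from auto-dag-data.
const DEFAULT_NODE_MAX_SIZE = 65_535        // assumed cap on a fully encoded IPLD node (data + links + metadata)
const DEFAULT_MAX_CHUNK_SIZE = 65_535 - 200 // assumed cap on raw chunk data, leaving room for encoding overhead

// Validating an encoded node against the chunk-size constant (the smaller of
// the two here) would reject nodes that are actually within the node limit.
function assertNodeFits(encodedNode: Uint8Array, maxNodeSize: number = DEFAULT_NODE_MAX_SIZE): void {
  if (encodedNode.length > maxNodeSize) {
    throw new Error(`Encoded node is ${encodedNode.length} bytes, over the ${maxNodeSize}-byte limit`)
  }
}
```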
Changes walkthrough 📝
chunker.ts
Correct default node size and chunk size handling
packages/auto-dag-data/src/ipld/chunker.ts
- Changed the default node size limit from DEFAULT_MAX_CHUNK_SIZE to DEFAULT_NODE_MAX_SIZE.
- Node sizing now uses DEFAULT_NODE_MAX_SIZE.
- maxChunkSize is handled separately from the node size limit.
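To illustrate that separation, here is a minimal sketch; the helper name is hypothetical and not taken from chunker.ts. Raw input is sliced by maxChunkSize, and only the encoded nodes built from those chunks are measured against the node size limit.

```ts
// Sketch only: this helper is hypothetical, not the chunker.ts API. It shows
// that maxChunkSize governs how raw data is sliced, while the node size limit
// (DEFAULT_NODE_MAX_SIZE) governs how large each built node may grow.
function* sliceIntoChunks(data: Uint8Array, maxChunkSize: number): Generator<Uint8Array> {
  for (let offset = 0; offset < data.length; offset += maxChunkSize) {
    yield data.subarray(offset, offset + maxChunkSize) // subarray clamps the end to data.length
  }
}
```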
nodes.ts
Use consistent node size default in node creation
packages/auto-dag-data/src/ipld/nodes.ts
- Replaced DEFAULT_MAX_CHUNK_SIZE with DEFAULT_NODE_MAX_SIZE for node size parameters.
- Uses DEFAULT_NODE_MAX_SIZE consistently across functions.
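A hedged sketch of the kind of default this change targets; createChunkNode, its shape, and the constant's value are invented for illustration, only the constant names come from the PR.

```ts
// Sketch only: function name, shape and value are hypothetical; only the
// constant names are from the PR.
const DEFAULT_NODE_MAX_SIZE = 65_535 // assumed value for illustration

interface ChunkNode {
  size: number
  data: Uint8Array
}

// Node creation is bounded by the node size limit; before the fix the default
// reportedly fell back to the chunk-size constant instead.
function createChunkNode(data: Uint8Array, maxNodeSize: number = DEFAULT_NODE_MAX_SIZE): ChunkNode {
  if (data.length > maxNodeSize) {
    throw new Error(`Chunk of ${data.length} bytes exceeds the ${maxNodeSize}-byte node limit`)
  }
  return { size: data.length, data }
}
```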
nodes.spec.ts
Update tests for new default node size
packages/auto-dag-data/tests/nodes.spec.ts
- Tests now use DEFAULT_NODE_MAX_SIZE instead of DEFAULT_MAX_CHUNK_SIZE.
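A sketch of the kind of assertion such a spec update implies, in Jest-style syntax; it reuses the hypothetical createChunkNode and DEFAULT_NODE_MAX_SIZE from the sketch above and is not the real nodes.spec.ts body.

```ts
// Sketch only: Jest-style test reusing the hypothetical createChunkNode and
// DEFAULT_NODE_MAX_SIZE from the sketch above; not the actual nodes.spec.ts.
describe('node creation defaults', () => {
  it('is bounded by DEFAULT_NODE_MAX_SIZE rather than the chunk-size default', () => {
    expect(() => createChunkNode(new Uint8Array(DEFAULT_NODE_MAX_SIZE))).not.toThrow()
    expect(() => createChunkNode(new Uint8Array(DEFAULT_NODE_MAX_SIZE + 1))).toThrow()
  })
})
```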