
Ability to async-read chunk data; fix data leak error for React Native #5522

Closed
wants to merge 1 commit

Conversation


@Flexoman Flexoman commented Nov 21, 2024

Preparing a solution to fix the data leak error for React Native.

Contributor

github-actions bot commented Nov 21, 2024

Diff output files
diff --git a/packages/@uppy/aws-s3/lib/HTTPCommunicationQueue.js b/packages/@uppy/aws-s3/lib/HTTPCommunicationQueue.js
index 29c5f4b..1b71b04 100644
--- a/packages/@uppy/aws-s3/lib/HTTPCommunicationQueue.js
+++ b/packages/@uppy/aws-s3/lib/HTTPCommunicationQueue.js
@@ -295,7 +295,7 @@ export class HTTPCommunicationQueue {
     };
     for (;;) {
       throwIfAborted(signal);
-      const chunkData = chunk.getData();
+      const chunkData = await chunk.getData();
       const {
         onProgress,
         onComplete,
@@ -401,7 +401,7 @@ async function _nonMultipartUpload2(file, chunk, signal) {
     },
   ).abortOn(signal);
   let body;
-  const data = chunk.getData();
+  const data = await chunk.getData();
   if (method.toUpperCase() === "POST") {
     const formData = new FormData();
     Object.entries(fields).forEach(_ref2 => {
diff --git a/packages/@uppy/aws-s3/lib/MultipartUploader.js b/packages/@uppy/aws-s3/lib/MultipartUploader.js
index e104fdc..8d79dc3 100644
--- a/packages/@uppy/aws-s3/lib/MultipartUploader.js
+++ b/packages/@uppy/aws-s3/lib/MultipartUploader.js
@@ -208,9 +208,13 @@ function _initChunks2() {
     _classPrivateFieldLooseBase(this, _chunks)[_chunks] = Array(arraySize);
     for (let offset = 0, j = 0; offset < fileSize; offset += chunkSize, j++) {
       const end = Math.min(fileSize, offset + chunkSize);
-      const getData = () => {
+      const getData = async () => {
         const i2 = offset;
-        return _classPrivateFieldLooseBase(this, _data)[_data].slice(i2, end);
+        return this.options.getChunkData
+          ? this.options.getChunkData(i2, end, {
+            chunkSize,
+          })
+          : _classPrivateFieldLooseBase(this, _data)[_data].slice(i2, end);
       };
       _classPrivateFieldLooseBase(this, _chunks)[_chunks][j] = {
         getData,
@@ -228,7 +232,7 @@ function _initChunks2() {
     }
   } else {
     _classPrivateFieldLooseBase(this, _chunks)[_chunks] = [{
-      getData: () => _classPrivateFieldLooseBase(this, _data)[_data],
+      getData: async () => _classPrivateFieldLooseBase(this, _data)[_data],
       onProgress: _classPrivateFieldLooseBase(this, _onPartProgress)[_onPartProgress](0),
       onComplete: _classPrivateFieldLooseBase(this, _onPartComplete)[_onPartComplete](0),
       shouldUseMultipart,
diff --git a/packages/@uppy/aws-s3/lib/index.js b/packages/@uppy/aws-s3/lib/index.js
index 88fd867..b6831cb 100644
--- a/packages/@uppy/aws-s3/lib/index.js
+++ b/packages/@uppy/aws-s3/lib/index.js
@@ -649,6 +649,7 @@ function _uploadLocalFile2(file) {
         return _this.uppy.log(...arguments);
       },
       getChunkSize: this.opts.getChunkSize ? this.opts.getChunkSize.bind(this) : undefined,
+      getChunkData: this.opts.getChunkData ? this.opts.getChunkData.bind(this) : undefined,
       onProgress,
       onError,
       onSuccess,

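The core of the patch is that each chunk's `getData()` becomes async, so a user-supplied `getChunkData(start, end, { chunkSize })` callback can produce each part lazily instead of slicing one file-sized Blob. A minimal sketch of that mechanism outside Uppy (all names here are illustrative, not Uppy's actual internals):

```javascript
// Sketch of the mechanism this diff introduces: each chunk descriptor gets
// an async getData() that defers to a user-supplied getChunkData(start, end,
// { chunkSize }) when provided, and only falls back to slicing an in-memory
// buffer otherwise.
function initChunks(fileSize, chunkSize, { data, getChunkData } = {}) {
  const chunks = [];
  for (let offset = 0; offset < fileSize; offset += chunkSize) {
    const start = offset;
    const end = Math.min(fileSize, offset + chunkSize);
    chunks.push({
      start,
      end,
      getData: async () =>
        getChunkData
          ? getChunkData(start, end, { chunkSize }) // lazy: reads only this range
          : data.slice(start, end), // eager fallback: whole file already in memory
    });
  }
  return chunks;
}

// Usage: a fake "file on disk" served range-by-range, so at most one
// chunk-sized buffer is materialized at a time.
async function demo() {
  const fileSize = 10;
  const chunkSize = 4;
  // stands in for a range read from disk, e.g. RNFS.read(path, ...)
  const readRange = async (start, end) => Buffer.alloc(end - start, 1);
  const chunks = initChunks(fileSize, chunkSize, { getChunkData: readRange });
  const sizes = [];
  for (const chunk of chunks) {
    const part = await chunk.getData();
    sizes.push(part.length);
  }
  return sizes; // one entry per part; the last part may be smaller
}
```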
@Murderlon
Member

Hi, can you explain in more detail what problem you were facing and why we need a new option for it?

Author

Flexoman commented Nov 28, 2024

@Murderlon Hi, the idea is to avoid memory exhaustion when uploading a large file (e.g. 2 GB) with multipart upload by never creating one large Blob: the largest Blob held in memory should be at most one chunk. An example implementation would look like:

```js
// add the file to Uppy
uppy.addFile({ size: 2 * 1024 ** 3 /* 2 GB */, originalPath: 'some_path' })

// use the AWS S3 getChunkData callback (signature matching the diff above:
// start/end byte offsets plus the chunk size)
uppy.use(AwsS3, {
  // ...
  getChunkData: async (start, end, { chunkSize }) => {
    // read only the requested byte range from disk via react-native-fs,
    const b64 = await RNFS.read(path, end - start, start, 'base64')
    // or use a native Java method to convert the path to a Blob through an
    // InputStream on Android (similar on iOS)
    return new Blob([b64], { type })
  },
})
```

@Murderlon
Member

What do you mean by "control memory leak"? Why isn't uppy.addFile() sufficient? If there is a memory leak, then we should fix it rather than move the problem to the implementer.

@Murderlon Murderlon closed this Jan 15, 2025
Author

Flexoman commented Jan 20, 2025

@Murderlon
When using Uppy in React Native to develop an Android APK, there's a strict limit on the maximum heap size per application. Consequently, uploading large files such as videos or zip files exceeding approximately 500MB to 700MB causes the application to crash due to memory leaks.

As a result, we've opted to remove Uppy from our mobile project and implement a custom uploader instead. However, there's a plan to reintegrate Uppy in a future major version update, pending necessary fixes.

--
The memory leak occurs because the entire file is loaded into a Blob object, causing the APK to allocate memory for the entire file.
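The scale of the problem described above can be put in rough numbers (illustrative figures, not measurements from the project): holding the whole file as one Blob versus materializing one chunk at a time.

```javascript
// Back-of-the-envelope comparison of peak allocation for the scenario above.
const MiB = 1024 * 1024;
const fileSize = 2 * 1024 * MiB; // a 2 GiB file
const chunkSize = 100 * MiB;     // a 100 MiB multipart chunk

const peakEager = fileSize;  // entire file loaded into one Blob
const peakLazy = chunkSize;  // at most one chunk materialized at a time

// peakEager far exceeds a typical Android per-app heap limit, while
// peakLazy stays comfortably within it.
const ratio = peakEager / peakLazy; // roughly 20x less peak memory held
```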

@Murderlon
Member

By default, multipart uploads go in 100MB chunks, not the entire file, and you can lower that number if you wish or use regular uploads. I still don't see how your option changes this. In any case, this responsibility should not be moved to the implementer; it should 'just work' in React Native.

Unfortunately I don't have experience with RN but if you can offer me alternatives that would be great.
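For reference, the `getChunkSize(file)` option mentioned here (which the diff's `index.js` hunk already forwards alongside the proposed `getChunkData`) is the existing lever for capping per-part memory. A hedged sketch of such a callback, assuming a caller-chosen memory budget and the standard S3 multipart limits (5 MiB minimum part size, 10,000 parts maximum):

```javascript
// Sketch of a getChunkSize(file)-style callback: keep each part within a
// memory budget while respecting S3 multipart constraints. The memoryBudget
// parameter is an assumption of this sketch, not an Uppy option.
const MiB = 1024 * 1024;
function getChunkSize(file, memoryBudget = 16 * MiB) {
  const minPart = 5 * MiB;  // S3 minimum part size (except the last part)
  const maxParts = 10000;   // S3 maximum number of parts per upload
  // smallest chunk size that still fits the file into 10,000 parts
  const floor = Math.ceil(file.size / maxParts);
  return Math.max(minPart, floor, Math.min(memoryBudget, file.size));
}
```

For a 2 GiB file this yields the 16 MiB budget; only for files large enough that 10,000 parts would not fit does the part size grow past it.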
