FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory #407

Open
artechventure opened this issue May 23, 2019 · 5 comments

@artechventure

Do you want to request a feature or report a bug?
bug

What is the current behavior?

Metro bundler build fails when adding aws-sdk-js on react-native 0.59.8.
It fails with the message below:
transform[stderr]: FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory

If the current behavior is a bug, please provide the steps to reproduce and a minimal repository on GitHub that we can yarn install and yarn test.

  1. Build the example project below (initialize with yarn install) with the Xcode run button.
  2. The Metro bundler appears, the build stops near 99%, and it fails with the error message above.

https://github.com/artechventure/rn59AwsDemo
(Created an empty RN project with react-native init and added only aws-sdk-js)

What is the expected behavior?
The build succeeds and the default RN screen appears.

Please provide your exact Metro configuration and mention your Metro, node, yarn/npm version and operating system.

React Native Environment Info:
  System:
    OS: macOS 10.14
    CPU: (8) x64 Intel(R) Core(TM) i7-4770HQ CPU @ 2.20GHz
    Memory: 2.16 GB / 16.00 GB
    Shell: 5.3 - /bin/zsh
  Binaries:
    Node: 10.15.3 - ~/.nvm/versions/node/v10.15.3/bin/node
    Yarn: 1.16.0 - ~/.nvm/versions/node/v10.15.3/bin/yarn
    npm: 6.9.0 - ~/.nvm/versions/node/v10.15.3/bin/npm
    Watchman: 4.9.0 - /usr/local/bin/watchman
  SDKs:
    iOS SDK:
      Platforms: iOS 12.1, macOS 10.14, tvOS 12.1, watchOS 5.1
    Android SDK:
      API Levels: 25, 27, 28
      Build Tools: 27.0.3, 28.0.2, 28.0.3
  IDEs:
    Android Studio: 3.2 AI-181.5540.7.32.5056338
    Xcode: 10.1/10B61 - /usr/bin/xcodebuild
  npmPackages:
    react: 16.8.3 => 16.8.3
    react-native: 0.59.8 => 0.59.8
  npmGlobalPackages:
    react-native-cli: 2.0.1

metro: 0.51.1

@4RGUS commented Oct 3, 2019

I am also having the same issue. @artechventure, have you found a solution for this?
Thanks!

@CalderBot

As of RN 0.59.0 you can increase the heap using NODE_ARGS by modifying the shell script in Xcode -> project -> Build Phases -> "Bundle React Native code and images" to:

export NODE_BINARY=node
export NODE_ARGS='--max_old_space_size=8192'
../node_modules/react-native/scripts/react-native-xcode.sh

See #22421
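
A quick way to sanity-check that a given --max-old-space-size value is being honored (this is just plain node outside the build, untested against this repro; heap_size_limit is reported in bytes, so it should print a bit over 8192 MB):

node --max-old-space-size=8192 -e "console.log(require('v8').getHeapStatistics().heap_size_limit / 1024 / 1024, 'MB')"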

As pointed out in https://medium.com/@ttqluong93/react-native-max-old-space-size-2af6754b5926, modifying app/build.gradle as follows fixes the analogous Android issue:

project.ext.react = [
    entryFile: "index.js",
    nodeExecutableAndArgs: ["node", "--max-old-space-size=8192"]
]
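
For reference, in the default RN 0.59 template that project.ext.react block sits near the top of android/app/build.gradle, above the apply from: "../../node_modules/react-native/react.gradle" line; react.gradle passes nodeExecutableAndArgs to the node process it spawns to bundle the JS. If you start Metro yourself from a terminal rather than through the Xcode/Gradle build step, raising the limit via NODE_OPTIONS should work as well, since --max-old-space-size is one of the V8 flags node accepts there (untested against this repro, just a sketch):

export NODE_OPTIONS=--max-old-space-size=8192
yarn start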

@johnnywang

I'm encountering this now as well, and it seems like bumping metro-react-native-babel-preset to anything above 0.73.5 will cause this (even though 0.73.6 seems to be a no-op as far as I can tell). Changing the heap size doesn't help either, as the node process will just continue to eat whatever available memory is there until crashing.

@robhogan (Contributor) commented Sep 9, 2023

> I'm encountering this now as well, and it seems like bumping metro-react-native-babel-preset to anything above 0.73.5 will cause this (even though 0.73.6 seems to be a no-op as far as I can tell). Changing the heap size doesn't help either, as the node process will just continue to eat whatever available memory is there until crashing.

Interesting - is it a reliable repro that switching between 0.73.5 and 0.73.6 of the preset makes the difference? Would you be willing to share your package.json and lock file (yarn.lock) in each case?
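
If it's easier than sharing the whole lock file, something like this (assuming Yarn classic) would at least show which versions of the preset and of metro actually get resolved in each branch:

yarn why metro-react-native-babel-preset
yarn list --pattern "metro" --depth=0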

@johnnywang

Hey @robhogan, as is always the case, trying to find a simpler repro for this just doesn't resurface the issue. I can consistently repro the problem in my broken branch, but if I reapply the exact same code to a new branch, it works fine. I suspect my yarn.lock somehow got into a weird state in the broken branch and something about the bump to 0.73.6 caused it to break, so this was probably just a weird buggy-state issue, sorry!
