
OOM when Gradle syncing after updating to 6.0.4 #1031

Closed
sky-ricardocarrapico opened this issue Dec 10, 2021 · 9 comments

@sky-ricardocarrapico

After updating from 5.17.1 to 6.0.4 I started having OOM when doing a Gradle sync for the second time on the same daemon.

Even the first sync seems to take longer than usual.

I ran the hprof through Eclipse Memory Analyzer:
[screenshot: Eclipse Memory Analyzer results, 2021-12-10]

The culprit seems to be related to GitRatchetGradle and VisitableURLClassLoader.

I'm not sure if there is any more useful info that can be extracted from the memory analyzer. Let me know what I can do to help.

Gradle version: 7.3/7.3.1
Spotless version: 6.0.4
OS: macOS 11.6

@nedtwigg
Member

Wow! 1.9GB of GitRatchet, that's a catastrophe. Thanks for sharing, will look into this.

@nedtwigg
Member

Probably related: #1025.

@nedtwigg
Member

This should be improved in plugin-gradle 6.0.5. Can you report back @sky-ricardocarrapico?
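
(For reference, upgrading is just a version bump wherever the plugin is declared. A minimal sketch, assuming the plugins DSL in the root build.gradle, with apply false because the plugin is applied per-subproject later:)

plugins {
    // Bump the Spotless Gradle plugin to the 6.0.5 release
    id 'com.diffplug.spotless' version '6.0.5' apply false
}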

@sky-ricardocarrapico
Author

sky-ricardocarrapico commented Dec 17, 2021

Doesn't seem to be fixed. Still takes a lot of time on the first sync and OOMs on the second.

On some syncs I saw warnings that raised a few suspicions. I tried not applying the plugin to some modules and that actually fixed the OOM. Before, I was doing this:

configure(subprojects.findAll { !it.path.contains("react") }) {
    apply plugin: "com.diffplug.spotless"
    apply from: "$rootDir/spotless.gradle"
}

I'm filtering out those react modules because they are included in the project but I don't control them.

In this project I also have some AAR modules, and a hierarchy where a module may have no code of its own and only act as the parent of other modules:

:lib:apiguard:aar
:feature:account

:lib, :lib:apiguard and :feature do not have any code; they just group other modules.

So by excluding these, the sync OOM finally stopped:

configure(subprojects.findAll {
    !it.path.contains("react") && !it.path.contains("aar") && it.childProjects.isEmpty()
}) {
    apply plugin: "com.diffplug.spotless"
    apply from: "$rootDir/spotless.gradle"
}

I have to exclude both the AAR modules and the group modules; excluding only one or the other doesn't solve it.

The build.gradle for an AAR module is just this:

configurations.maybeCreate("default")
artifacts.add("default", file('libs/apiguard.aar'))

And the group modules don't even have one.

I hope this helps somehow.

@nedtwigg
Member

Huh. I wonder if the issue is somehow related to the target resolving to empty. This is useful info, but without a public repro it's hard to move further.

In my testing, there was for sure a memory usage regression introduced in 6.0.0, and it was for sure improved in 6.0.5. Happy to reopen with a public repro; totally understand if that's not possible. Thanks very much for the info you've provided already!

@sky-ricardocarrapico
Author

I'll try to replicate these conditions in a new project. I'm not very hopeful, because it's likely that the project size contributes to the OOM and I'm not able to share anything regarding it. But let's see.

Thank you for your time on this issue!

@sky-ricardocarrapico
Author

As expected, I was not able to replicate the OOMs in a new project.

I did some more tests on the actual project, and the bug turns out not to be related to the AAR modules at all. I don't know how I reached that conclusion the other day. Filtering on childProjects.isEmpty() is enough.

But this got me thinking that we don't even need ratchet, because the whole codebase already conforms to the rules we apply. Simply removing ratchet decreases memory consumption significantly, and there doesn't seem to be any drawback. spotlessCheck seems to be even faster now, which I find weird. Isn't ratchet supposed to run lint only on the files that changed, as opposed to running it on every file in the codebase?
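
To be concrete, dropping ratchet was a one-line change in our spotless.gradle. A rough sketch of what that looks like (the branch name and formatter steps below are illustrative, not our exact setup):

spotless {
    // ratchetFrom 'origin/develop'   // removed this line; 'origin/develop' is just an example baseline
    kotlin {
        target '**/*.kt'
        ktlint()
    }
}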

Were we hitting excessive garbage collection because of ratchet? For reference, these are the JVM args from our gradle.properties:

org.gradle.jvmargs=-Xmx4096m -XX:MaxPermSize=1024m -XX:+HeapDumpOnOutOfMemoryError -XX:+UseParallelGC -Dfile.encoding=UTF-8

@nedtwigg
Member

spotlessCheck seems to be even faster now, which I find weird.

The Gradle up-to-date mechanism runs first. Only the files which Gradle identifies as changed are then checked against the git index.

Because Gradle runs the show, we do the git status file by file, which makes it slower than a normal git status. The point of ratcheting is more about correctness than performance: copyright headers, changing code style bit by bit, etc. If your formatters are slow/expensive enough, then a file-by-file git status might speed things up. If the formatters are fast enough, it might be better to skip the git status, which sounds like it is the case for your build.
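
For anyone unfamiliar with the feature, ratcheting is the opt-in ratchetFrom setting. A minimal sketch of the kind of setup it is aimed at (the branch name, file path, and formatter choices here are illustrative):

spotless {
    // Only files that differ from this git reference get checked/formatted;
    // Gradle's up-to-date check has already filtered out everything else.
    ratchetFrom 'origin/main'
    java {
        target 'src/**/*.java'
        // Typical "correctness" uses: add license headers and tighten style
        // on files as they change, rather than reformatting everything at once.
        licenseHeaderFile rootProject.file('gradle/license-header.txt')  // hypothetical path
        googleJavaFormat()
    }
}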

I'm very surprised by the memory consumption issue. It seems that we must have a bug, but I can't replicate it. We definitely had one bug which we squashed thanks to your report; perhaps the other bug will get squashed in the future :)

@sky-ricardocarrapico
Author

The Gradle up-to-date mechanism runs first. Only the files which Gradle identifies as changed are then checked against the git index.

Ah, I was not aware of that. So we actually made the correct decision by dropping ratchet. Everything seems to be much better now.

Since it's likely that the project size plays a part in the bug, I really don't know how I can help further in identifying the problem. Let's hope it doesn't affect anyone else.

Anyway, thanks for your help in all of this. Have a happy new year!
