Use Elemental_jll #54
Conversation
Yes. That will certainly help. cc @simonbyrne
It's still not completely clear to me how this would work with a custom HPC MPI. Could you please elaborate on what would happen if you try to install Elemental on a cluster with a custom MPI version?
I think in that case we need to allow people to provide their own. My thought is that either you use BB all the way, or you compile your MPI dependencies on your own (say, use Spack), but it isn't a good idea for us to try to provide an in-between thing where we facilitate building your own Elemental using system libraries.
I agree. The problem is that there isn't really an effective mechanism for specifying which library to use. The Overrides.toml file is too limited to be of much use.
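For reference, the override mechanism being discussed is a `~/.julia/artifacts/Overrides.toml` file that maps either an artifact's content hash or a JLL package's UUID to a local path. A rough sketch (the hash, UUID, artifact name, and paths below are all made-up placeholders, not real entries):

```toml
# Override a single artifact by its content hash (placeholder hash):
78f35e74ff113f02274ce60dab6e92b4546ef806 = "/opt/mpich-3.3"

# Or override an artifact of a JLL package by the package's UUID
# (UUID and artifact name below are hypothetical):
[aaaaaaaa-bbbb-cccc-dddd-eeeeeeeeeeee]
MPI = "/opt/mpich-3.3"
```

As noted above, this only redirects where an artifact's files come from; it has no way to express ABI constraints or rebuild downstream JLLs against the substituted library, which is why it is too limited for the custom-MPI case.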
We perhaps need @staticfloat or @Keno to sketch out a solution to this. We can then figure out how to go about doing it. Essentially in the BB …
Compare d938432 to d7583ff
I'm not really sure what's up with CI but @ViralBShah you can try out this branch if you like, I think it should work identically to master branch. I was probably overzealous in deleting a bunch of the CI setup stuff, since we still have to use MPI.jl in the tests. |
I'll also note that despite the build failing here, it's not actually different from master -- master is setting the exit code improperly (with …).
If the same tests as master are failing, how about marking them as expected failures (or disabling them) and getting this all merged?
Let's not merge before tests pass. If there are issues with master CI then let's fix that. |
I've fixed the tests in #62, so please rebase and hopefully tests will pass here as well. Update: I've resolved the conflict, so rebasing shouldn't be necessary.
I've no idea why we are getting this error https://travis-ci.org/github/JuliaParallel/Elemental.jl/jobs/681894268#L1211-L1225. Any ideas? |
Co-authored-by: Andreas Noack <andreas@noack.dk>
How do I get Elemental to pick up Elemental_jll on the other Julia processes? I am using the released jll and the dev version of Elemental.jl on your branch.
Did you run …
As in, should I do …
Uh, sorry, misread the question. That's strange 😕 |
There's also still this MPI build error here that seems like a bug in the MPI build script. |
I wonder if the other Julia processes are only using the package UUID, and so are getting the current released (without JLL) version of Elemental? Is the manifest used here? I'm not really sure. |
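If the problem really is that workers resolve the package from the default environment, one common fix is to pass the parent's active project to the workers explicitly. A sketch (assuming the dev'd Elemental.jl lives in the currently active project; `2` workers is arbitrary):

```julia
using Distributed

# Workers started by addprocs do not inherit the parent's --project flag,
# so they may resolve Elemental from the default environment (i.e. the
# released, non-JLL version) instead of the dev'd one in this Manifest.
# Passing the active project via exeflags makes them use the same one:
addprocs(2; exeflags = "--project=$(Base.active_project())")

@everywhere using Elemental  # now resolves against the same Manifest
```

Whether this is the actual cause here is a guess, but it matches the symptom of workers silently loading a different version of the same UUID.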
Do we want to keep the …
Pinging @simonbyrne in here (since he may be interested in following the progress and issues here).
Hmm, not sure what the problem is. I'll try to spend some time finishing up JuliaParallel/MPI.jl#367 next week. |
The Mac issues are fixed, but I don't know what to make of this: …
Potentially a race condition in precompilation? |
Hm, that's a good point -- it's importing a test-only dependency, so the …
We could also just disable precompilation in CI since it's also probably slowing down the run. |
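One mitigation for such races (a sketch of the usual workaround, not necessarily what this CI ends up doing): precompile serially in the parent process before any workers or MPI ranks start, so they find a warm cache instead of racing to write it. Alternatively, precompilation can be skipped entirely by starting Julia with `--compiled-modules=no`.

```julia
# Precompile everything once, in a single process, before launching
# workers/ranks. This serializes writes to ~/.julia/compiled so that
# concurrently started processes don't race on the same cache files.
using Pkg
Pkg.precompile()
```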
Looks like that part is working now, at least. It's trying to run a bare …
In order to merge this, do we essentially need to fix this one? |
In order for tests to pass, I think so. |
There still seem to be some race conditions in CI sometimes, but I think we can probably squash + merge this now?
Go for it! |
See CI results: This does not work 😦
Lots of segfaults happening. Not sure if they're because of the MPI version being used to run the tests, or the way that Elemental_jll was built, or what...
I wonder if JuliaParallel/MPI.jl#367 would help, since then the MPI launching the tests would be the same as Elemental is linked against (I think).