
restartable relaxation process for stars #414 #417

Merged: 23 commits, May 12, 2023
Conversation

danieljprice (Owner) commented May 11, 2023

Type of PR:
enhanced functionality implementing #414

Description:
If phantomsetup is interrupted halfway through a stellar relaxation, or the relaxation did not fully converge, the process can be restarted by simply running phantomsetup again. Likewise, phantomsetup now looks for matching snapshots from previous relaxation runs and does not repeat the procedure if the star is already relaxed.
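The restart behaviour described above amounts to: scan for snapshot files left by a previous relaxation run and resume from the highest-numbered one if any exist. A minimal sketch of that lookup, in Python rather than Phantom's actual Fortran, and with a hypothetical `relax_NNNNN` file-naming convention chosen purely for illustration:

```python
import glob
import re

def find_restart_snapshot(prefix="relax"):
    """Look for snapshots from a previous relaxation run.

    Returns the highest-numbered matching file so the relaxation can
    resume from it, or None if it must start from scratch.
    NOTE: the 'relax_NNNNN' naming scheme is a hypothetical stand-in,
    not Phantom's real convention.
    """
    candidates = []
    for fname in glob.glob(prefix + "_*"):
        m = re.fullmatch(prefix + r"_(\d+)", fname)
        if m:
            candidates.append((int(m.group(1)), fname))
    if not candidates:
        return None  # no previous run found: relax from the beginning
    # resume from the latest snapshot (largest sequence number)
    return max(candidates)[1]
```

Re-running the setup program then either picks up where the interrupted run left off or, if the final snapshot shows the star already relaxed, skips the procedure entirely.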

This PR also fixes a couple of memory allocation errors encountered when running phantomsetup with DEBUG=yes.

To prevent these from recurring, phantomsetup is now always compiled with DEBUG=yes in the GitHub Actions tests.

This required fixes across a number of different setup routines.

Testing:
Ran phantomsetup using SETUP=binary as described in the docs.

Did you run the bots? yes

Did you update relevant documentation in the docs directory? yes

danieljprice changed the title from "restartable relaxation process for stars" to "restartable relaxation process for stars #414" on May 11, 2023
danieljprice merged commit 78e4948 into master on May 12, 2023
danieljprice deleted the set_star branch on May 12, 2023 at 07:58
Labels: none · Projects: none · 1 participant