
add JointPrimaryMarginalizedModel likelihood model #4505

Merged: 60 commits merged into gwastro:master on Mar 28, 2024

Conversation

WuShichao (Contributor)

@ahnitz This PR implements a likelihood model for multiband PE.

@@ -599,3 +602,273 @@ def _loglikelihood(self):
for stat in self.extra_stats_map[lbl]:
setattr(self._current_stats, stat, mstats[stat.subname])
return logl


class MultibandRelativeTimeDom(HierarchicalModel):
Member:

It's not really tied to the RelativeTimeDom, right? It's just tied to a primary model / secondary model scenario, no?

Contributor Author:

I named it this way because the primary model uses RelativeTimeDom, and the name also emphasizes the multiband purpose. I know you want a more general likelihood model, though.

Member:

@WuShichao I don't think this is tied to the relative model, so I think the name still needs to be fixed. (it is tied to models that do marginalization).

Contributor Author:

@ahnitz Can you suggest a name? This multiband likelihood relies on the primary model's RelativeTimeDom, and has specific code to do the marginalization for LISA. So it's not a likelihood model for general usage, right?

Member:

It doesn't; it just relies on the primary model doing vector marginalization. Nothing explicitly requires that either submodel be a LISA model, or even a specific model, just that the primary model use vector marginalization at some point.

It could be used for general usage if, for example, you want to do a joint EM + GW analysis. In that case, sky marginalization would still be useful and one could just use your code with modified configuration files.
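To make the primary/secondary idea concrete, here is a minimal conceptual sketch (not the code in this PR) of combining log likelihoods over the primary model's marginalization draws before collapsing them; primary_logl_partial and secondary_logl are illustrative placeholder names.

import numpy as np
from scipy.special import logsumexp

def joint_marginalized_logl(primary_logl_partial, secondary_logl):
    # Per-draw log-likelihood terms from the primary model's vector
    # marginalization, plus the secondary model evaluated at the same draws.
    combined = primary_logl_partial + secondary_logl
    # Collapse the draws only after the submodels have been combined,
    # averaging over the draws in log space.
    return logsumexp(combined) - np.log(len(combined))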

Contributor Author (WuShichao, Dec 13, 2023):

@ahnitz OK, then let's name it PrimaryMarginalizedJointModel. Do you think that's better?

# remove all marginalized parameters from the top-level model's
# `variable_params` and `prior` sections
primary_model = '%s__model' % kwargs['primary_lbl'][0]
marginalized_params = cp.get(primary_model,
Member:

You should avoid hardcoding what the set of marginalizations is.

Contributor Author:

I have made this part more general. Please check.
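For readers following along, a hedged sketch of the approach described in the inline comment above, using only the standard library ConfigParser: read the primary model's marginalized parameters from its own section and drop them from the top-level variable_params and prior sections. The option name marginalized_params and the prior-section naming are assumptions for illustration, not necessarily what the PR uses.

from configparser import ConfigParser

def strip_marginalized_params(cp: ConfigParser, primary_lbl: str):
    primary_section = '%s__model' % primary_lbl
    # Hypothetical option listing the parameters the primary model
    # marginalizes over, space separated.
    marginalized = cp.get(primary_section, 'marginalized_params',
                          fallback='').split()
    for param in marginalized:
        # Drop the parameter from the top-level sections so it is not
        # sampled twice.
        if cp.has_option('variable_params', param):
            cp.remove_option('variable_params', param)
        prior_section = 'prior-%s' % param
        if cp.has_section(prior_section):
            cp.remove_section(prior_section)
    return marginalized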

for i in range(nums):
current_params_other.update(
{key: value[i] for key, value in margin_params.items()})
if 'distance' in self.primary_model.static_params:
Member:

See if you can make this more general.

Contributor Author:

I have made this part more general. In order not to pass the primary model's f_lower to the LISA model, I needed to add it to that if check.
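As an illustration of the "make this more general" request (not the PR's exact logic), one way to avoid hardcoding names such as distance or f_lower is to forward only those primary-model static parameters that a secondary model actually declares; primary and other are assumed to expose the usual static_params / variable_params attributes.

def forwardable_static_params(primary, other):
    # Only pass along parameters the other submodel knows about, so
    # waveform settings such as the primary model's f_lower never leak
    # into the LISA submodel.
    wanted = set(other.static_params) | set(other.variable_params)
    return {name: value
            for name, value in primary.static_params.items()
            if name in wanted}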

@WuShichao WuShichao requested a review from ahnitz November 13, 2023 18:05
if not prefix.endswith('__'):
prefix += '__'
try:
for lbl, model in self.submodels.items():
Contributor Author:

@ahnitz I also modified this write_metadata method. Previously I just copied it from MultiSignalModel, but there is an important difference between my model and it: unlike MultiSignalModel, my submodels don't share the same detector data, so I have to save the lognl of every submodel. Is this part correct?
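For context, a rough sketch of the bookkeeping being described, assuming an h5py-like output file with an attrs mapping and submodels that expose detectors and det_lognl; the attribute names are illustrative, not the PR's exact ones.

def write_submodel_lognls(fp, submodels):
    # Each submodel sees different detector data, so store its noise
    # log likelihood under its own label prefix and accumulate the total.
    total = 0.0
    for lbl, model in submodels.items():
        prefix = lbl if lbl.endswith('__') else lbl + '__'
        for det in model.detectors:
            det_lognl = model.det_lognl(det)
            fp.attrs['{}{}_lognl'.format(prefix, det)] = det_lognl
            total += det_lognl
    fp.attrs['lognl'] = total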

Contributor Author:

@ahnitz Just checked; the numbers saved in the final posterior file make sense:
[screenshots attached]

@WuShichao WuShichao requested a review from ahnitz March 1, 2024 22:25

margin_names = list(
self.primary_model.marginalize_vector_params.keys())
margin_names.remove('logw_partial')
Member:

Do you need this removal? What happens if you don't do this?

Contributor Author:

@ahnitz Those names are used to update the parameters of the LISA model, so there is no need for logw_partial in the LISA model.
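A small, self-contained sketch of that point (the variable names mirror the diff above, but the function itself is only illustrative): take the primary model's vector-marginalized parameters, drop the internal logw_partial weights, and build one parameter set for the other submodel per marginalization draw.

def secondary_param_draws(primary_model, current_params_other):
    # Copy the marginalized parameter arrays and drop the weights, which
    # are internal to the primary model and not parameters of the other
    # submodel.
    margin_params = dict(primary_model.marginalize_vector_params)
    margin_params.pop('logw_partial', None)
    ndraws = len(next(iter(margin_params.values())))
    for i in range(ndraws):
        params = dict(current_params_other)
        params.update({name: values[i]
                       for name, values in margin_params.items()})
        yield params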

ahnitz (Member) left a comment:

@WuShichao If you are now confident this works in your tests, you may merge. Please see the few remaining comments to check whether they can be easily addressed; otherwise, go ahead.

@WuShichao WuShichao requested a review from ahnitz March 27, 2024 20:04
ahnitz (Member) left a comment:

@WuShichao Add a comment as we discussed and then merge once the tests pass

@WuShichao WuShichao enabled auto-merge (squash) March 28, 2024 15:03
@WuShichao WuShichao merged commit 93de71f into gwastro:master Mar 28, 2024
31 of 33 checks passed
spxiwh added a commit to spxiwh/pycbc that referenced this pull request Apr 3, 2024
GarethCabournDavies pushed a commit to GarethCabournDavies/pycbc that referenced this pull request Apr 5, 2024
* Update __init__.py
* Update hierarchical.py
* Update relbin.py
* Update tools.py
* Update hierarchical.py
* Update relbin.py
* Update base_data.py
* Update hierarchical.py
* Update hierarchical.py
* Update base_data.py
* Update relbin.py
* Update relbin.py
* Update hierarchical.py
* Update hierarchical.py
* Update hierarchical.py
* Update relbin.py
* fix cc issues in hierarchical.py
* fix cc issues in __init__.py
* fix cc issues in relbin.py
* add progress bar for testing
* add reconstruct method
* fix cc issues
* fix cc issue
* rename, fix typo, combine resconstruct method
* remove unused import
* fix cc issue
* update reconstruct
* fix cc issues
* fix cc issue
* move others_lognl to my model
* add comments
* follow Alex's suggestion
* fix cc issue
* update
* store primary model label
* add update back
* dynesty works again
* update
* fix
* it works
* remove print
* update
* fix cc issue
* Update hierarchical.py
* it works
* Update hierarchical.py
* Update hierarchical.py
* Update hierarchical.py