add JointPrimaryMarginalizedModel likelihood model #4505
Conversation
@@ -599,3 +602,273 @@ def _loglikelihood(self):
            for stat in self.extra_stats_map[lbl]:
                setattr(self._current_stats, stat, mstats[stat.subname])
        return logl


class MultibandRelativeTimeDom(HierarchicalModel):
It's not really tied to the RelativeTimeDom, right? It's just tied to a primary model / secondary model scenario, no?
I named it this way because the primary model uses RelativeTimeDom, and it also emphasizes the multiband purpose. I know you want a more general likelihood model, though.
@WuShichao I don't think this is tied to the relative model, so I think the name still needs to be fixed. (it is tied to models that do marginalization).
@ahnitz Can you suggest a name? This multiband likelihood relies on the primary model's RelativeTimeDom, and has specific code to do the marginalization for LISA. So it's not a likelihood model for general usage, right?
It doesn't. It just relies on the primary model doing vector marginalization. Nothing explicitly requires that either submodel be a LISA model, or even be specific models, just that the primary model use vector marginalization at some point.
It could be used for general usage if, for example, you want to do a joint EM + GW analysis. In that case, sky marginalization would still be useful, and one could just use your code with modified configuration files.
@ahnitz OK, then let's name it PrimaryMarginalizedJointModel. Do you think that's better?
        # remove all marginalized parameters from the top-level model's
        # `variable_params` and `prior` sections
        primary_model = '%s__model' % kwargs['primary_lbl'][0]
        marginalized_params = cp.get(primary_model,
You should avoid hardcoding the set of marginalizations.
I have made this part more general. Please check.
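The idea behind the snippet above can be sketched with the standard-library ConfigParser. This is only a toy illustration of "strip the primary model's marginalized parameters out of the top-level sampling sections"; the section and option names (`primary__model`, `marginalized_params`, `variable_params`, `prior-...`) and the helper `prune_marginalized` are assumptions modeled on the diff, not the actual PyCBC API.

```python
from configparser import ConfigParser

def prune_marginalized(cp, primary_lbl):
    """Remove the primary submodel's marginalized parameters from the
    top-level variable_params section and their prior sections.
    Hypothetical helper; names modeled on the config layout above."""
    primary_model = '%s__model' % primary_lbl
    # assumed option listing the marginalized parameter names
    names = cp.get(primary_model, 'marginalized_params').split()
    for name in names:
        cp.remove_option('variable_params', name)
        prior_sec = 'prior-%s' % name
        if cp.has_section(prior_sec):
            cp.remove_section(prior_sec)
    return names

cp = ConfigParser()
cp.read_dict({
    'primary__model': {'marginalized_params': 'tc distance'},
    'variable_params': {'tc': '', 'distance': '', 'mchirp': ''},
    'prior-tc': {'name': 'uniform'},
})
removed = prune_marginalized(cp, 'primary')
print(removed)                      # ['tc', 'distance']
print(list(cp['variable_params']))  # ['mchirp']
```

Because the marginalized parameters are handled internally by the primary model, removing them here keeps the sampler from varying them a second time.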
        for i in range(nums):
            current_params_other.update(
                {key: value[i] for key, value in margin_params.items()})
            if 'distance' in self.primary_model.static_params:
See if you can make this more general
I have made this part more general. In order not to pass the primary model's f_lower to LISA, I need to add it into that if check.
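The loop in the diff above iterates over the vector of marginalization samples and evaluates the other submodels at each sample point. A minimal sketch of that pattern, with a toy `secondary_loglr` standing in for the real secondary-model likelihood (all names here are hypothetical, not the PyCBC API):

```python
import numpy as np

def secondary_loglr(params):
    # toy secondary-model likelihood, peaked at tc=2.0, distance=100.0
    return -0.5 * ((params['tc'] - 2.0) ** 2
                   + ((params['distance'] - 100.0) / 10.0) ** 2)

# vectors of sample values, as produced by the primary model's
# vector marginalization (hypothetical values)
margin_params = {
    'tc': np.array([1.9, 2.0, 2.1]),
    'distance': np.array([90.0, 100.0, 110.0]),
}
static_params = {'mchirp': 30.0}

nums = len(next(iter(margin_params.values())))
logls = np.zeros(nums)
for i in range(nums):
    current_params_other = dict(static_params)
    # pick the i-th marginalization sample for every marginalized parameter
    current_params_other.update(
        {key: value[i] for key, value in margin_params.items()})
    logls[i] = secondary_loglr(current_params_other)

print(logls)  # peaks at index 1, where both parameters hit the toy maximum
```

The resulting per-sample log-likelihoods can then be combined with the primary model's partial weights to complete the joint marginalization.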
        if not prefix.endswith('__'):
            prefix += '__'
        try:
            for lbl, model in self.submodels.items():
@ahnitz I also modified this write_metadata method. Previously I just copied it from MultiSignalModel, but I noticed a main difference between my model and it: unlike MultiSignalModel, my submodels don't share the same detector data, so I have to save all the lognl values. Is this part correct?
@ahnitz Just checked, the numbers saved in the final posterior file make sense.
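The bookkeeping discussed above can be sketched as follows: since the submodels analyze different detector data, each contributes its own noise log-likelihood, so the joint lognl is the sum over submodels rather than a single shared value. All names and numbers here are hypothetical illustrations, not the actual write_metadata implementation.

```python
# per-submodel noise log-likelihoods (hypothetical values)
submodel_lognl = {
    'primary': -1234.5,   # e.g. a ground-based network submodel
    'lisa': -987.6,       # e.g. a LISA submodel
}

# save each submodel's lognl under a prefixed key, plus the total
metadata = {}
for lbl, lognl in submodel_lognl.items():
    metadata['%s__lognl' % lbl] = lognl
metadata['lognl'] = sum(submodel_lognl.values())

print(sorted(metadata))  # ['lisa__lognl', 'lognl', 'primary__lognl']
```

Keeping the per-submodel values alongside the total makes it possible to check each data stream's noise likelihood separately in the posterior file.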
        margin_names = list(
            self.primary_model.marginalize_vector_params.keys())
        margin_names.remove('logw_partial')
Do you need this removal? What happens if you don't do this?
@ahnitz Those are related to updating parameters in the LISA model, so there is no need for logw_partial in the LISA model.
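In other words, the vector-marginalization record mixes physical parameters with bookkeeping entries such as the partial log weights, and only the physical parameters should be forwarded to the other submodels. A toy sketch (the dict contents are hypothetical):

```python
# hypothetical vector-marginalization record from the primary model
marginalize_vector_params = {
    'tc': [1.9, 2.0, 2.1],
    'distance': [90.0, 100.0, 110.0],
    'logw_partial': [-0.1, -0.2, -0.3],  # internal weights, not a parameter
}

# forward only the physical parameter names to the secondary models
margin_names = list(marginalize_vector_params.keys())
margin_names.remove('logw_partial')
print(margin_names)  # ['tc', 'distance']
```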
@WuShichao If you are now confident this works in your tests, you may merge. Please see the few remaining comments to see if they can be easily addressed, otherwise go ahead.
@WuShichao Add a comment as we discussed and then merge once the tests pass
This reverts commit 93de71f.

Commits:
* Update __init__.py
* Update hierarchical.py
* Update relbin.py
* Update tools.py
* Update hierarchical.py
* Update relbin.py
* Update base_data.py
* Update hierarchical.py
* Update hierarchical.py
* Update base_data.py
* Update relbin.py
* Update relbin.py
* Update hierarchical.py
* Update hierarchical.py
* Update hierarchical.py
* Update relbin.py
* fix cc issues in hierarchical.py
* fix cc issues in __init__.py
* fix cc issues in relbin.py
* add progress bar for testing
* add reconstruct method
* fix cc issues
* fix cc issue
* rename, fix typo, combine reconstruct method
* remove unused import
* fix cc issue
* update reconstruct
* fix cc issues
* fix cc issue
* move others_lognl to my model
* add comments
* follow Alex's suggestion
* fix cc issue
* update
* store primary model label
* add update back
* dynesty works again
* update
* fix
* it works
* remove print
* update
* fix cc issue
* Update hierarchical.py
* it works
* Update hierarchical.py
* Update hierarchical.py
* Update hierarchical.py
@ahnitz This PR implements a likelihood model for multiband PE.