There are also a large number of transforms defined in `foundry/glm/glm.py:66-120`. Have these been validated? Can we assume that we're actually doing MLE of the parameters in the GLM (assuming `penalty=0.`), or are we optimizing for some other objective?
Most of the distributions in torch.distributions are supported by just creating your own `Family` instance. Adding support for other aliases (e.g. being able to pass the string "gamma" to `Glm`) is also super easy -- feel free to create a branch/PR for this.
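For context, a likelihood-based `Family` for gamma would just need the distribution's `log_prob`. I'm not showing the `Family` constructor itself here (its exact signature is an assumption, so check foundry's code), only the `torch.distributions.Gamma` side that such a wrapper would delegate to:

```python
import torch
from torch.distributions import Gamma

# The torch side a gamma Family would wrap: a parameterized distribution
# plus per-observation log-likelihoods via log_prob().
dist = Gamma(concentration=torch.tensor(2.0), rate=torch.tensor(0.5))
y = torch.tensor([1.0, 3.0, 7.0])
print(dist.log_prob(y))  # one log-likelihood term per observation
```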
I'm not sure I understand your second question, but yes, I believe the optimized parameters should be the MLEs if `penalty=0.`
I thought about this in a059f63, but there are actually two complications:
We should try to make these aliases use "location" parameterizations, meaning that the default behavior (predicting only the first parameter of the distribution with covariates) lets you predict the mean of the distribution. This is what led to the current `NegativeBinomial` implementation differing from the default. I think this should be pretty easy for gamma, but I haven't done it yet.
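To illustrate what a "location" parameterization would mean for gamma (this is my sketch of the idea, not foundry's implementation): predict the mean `mu` with covariates and keep a separate shape parameter. Since `torch.distributions.Gamma(concentration, rate)` has mean `concentration / rate`, the mapping is `concentration = shape`, `rate = shape / mu`:

```python
import torch
from torch.distributions import Gamma

def mean_parameterized_gamma(mu: torch.Tensor, shape: torch.Tensor) -> Gamma:
    # Re-parameterize so the first (covariate-predicted) parameter is the
    # mean: mean = concentration / rate = shape / (shape / mu) = mu.
    return Gamma(concentration=shape, rate=shape / mu)

mu = torch.tensor([1.0, 4.0, 10.0])
dist = mean_parameterized_gamma(mu, shape=torch.tensor(2.0))
print(dist.mean)  # recovers mu exactly
```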
Is there a plan to add support for a gamma distribution GLM in the future?
The current list of supported GLM distributions is:
['bernoulli', 'binomial', 'categorical', 'multinomial', 'poisson', 'negative_binomial', 'exponential', 'weibull', 'gaussian', 'ceiling_weibull']