Hello, thank you for your code!
I have a question, though: the addition below does not seem to appear in the paper, where the soft mask branch only uses addition at its skip connections. Could you help me understand why this extra addition is there?
out_interp = self.interpolation1(out_middle_2r_blocks) + out_down_residual_blocks1
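For context, here is a minimal sketch of the wiring that line sits in, with simplified stand-in layers rather than the repository's exact modules: the deeper features are upsampled and added back to the same-resolution features from the downsampling path, which is the skip-connection addition of the soft mask branch.

```python
import torch
import torch.nn as nn

class SoftMaskBranchSketch(nn.Module):
    """Simplified stand-in for the soft mask branch wiring in question."""

    def __init__(self, channels):
        super().__init__()
        self.down1 = nn.MaxPool2d(kernel_size=3, stride=2, padding=1)
        self.res_down1 = nn.Conv2d(channels, channels, 3, padding=1)   # stand-in for a residual block
        self.down2 = nn.MaxPool2d(kernel_size=3, stride=2, padding=1)
        self.res_middle = nn.Conv2d(channels, channels, 3, padding=1)  # stand-in for the middle "2r" blocks
        self.up1 = nn.UpsamplingBilinear2d(scale_factor=2)             # stand-in for interpolation1

    def forward(self, x):
        out_down_residual_blocks1 = self.res_down1(self.down1(x))
        out_middle_2r_blocks = self.res_middle(self.down2(out_down_residual_blocks1))
        # The line being asked about: upsampled deeper features are added to the
        # same-resolution features from the downsampling path (skip connection).
        out_interp = self.up1(out_middle_2r_blocks) + out_down_residual_blocks1
        return out_interp
```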
I referred to the Caffe version; you can consider it a trick.
Hi, thanks for your code. When I use 'ResidualAttentionModel_92_32input_update', I hit the issue described here because of 'AttentionModule_stage1_cifar'. If I switch to 'ResidualAttentionModel_92' it works, but then I cannot load the provided pretrained weights because of many mismatched keys. Do you have a good way to load the provided model, or is there another pretrained model I can use? Thanks!
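One common workaround for partial mismatches is to keep only the checkpoint tensors whose names and shapes match the current model and load with strict=False, so the remaining layers keep their fresh initialization. The sketch below assumes the checkpoint file stores a plain state_dict; the path is a placeholder for whichever checkpoint you downloaded, not a file name confirmed by this repository.

```python
import torch

def load_matching_weights(model, checkpoint_path):
    """Load only the checkpoint entries whose name and shape match `model`."""
    state_dict = torch.load(checkpoint_path, map_location="cpu")
    model_state = model.state_dict()
    matched = {
        name: tensor for name, tensor in state_dict.items()
        if name in model_state and tensor.shape == model_state[name].shape
    }
    skipped = [name for name in state_dict if name not in matched]
    # strict=False tolerates the keys we dropped and any keys the model has
    # that the checkpoint lacks; those layers stay randomly initialized.
    model.load_state_dict(matched, strict=False)
    print(f"loaded {len(matched)} tensors, skipped {len(skipped)} mismatched entries")
    return model
```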