Add support for all output activations #310
Conversation
Hi @Rubinjo, thanks for the contribution.
Yeah, it is indeed a pretty rigorous change. I mainly made the change not to duplicate the
I would leave
Yeah I agree. Preserving
I have adjusted it; now all three points from your earlier message should be satisfied.
Looks good to me. Sorry for the delay! :)
I'm working on a binary classification problem and therefore have a sigmoid activation instead of a softmax activation function for my output layer. I have adjusted the model_wo_softmax function to accept any kind of activation by passing it as an argument, so that it also covers binary classification problems. See here the old vs. the new call:
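The idea behind the change can be sketched as follows. This is a minimal stand-in, not the library's actual implementation: the helper name model_wo_output_activation, the (logits function, activation name) model representation, and the toy weights are all assumptions for illustration. The point is that the generalized helper takes the output activation's name as an argument and returns the model without that final activation, so a sigmoid head is handled just like a softmax one.

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def model_wo_output_activation(model, activation="softmax"):
    """Return the model without its final output activation,
    generalizing a softmax-only helper to any named activation.
    Here a 'model' is just a (logits_fn, activation_name) pair."""
    logits_fn, act_name = model
    if act_name != activation:
        raise ValueError(f"expected output activation {activation!r}, got {act_name!r}")
    return logits_fn  # raw pre-activation scores

# Toy binary-classification head: one output unit followed by sigmoid.
W = np.array([[1.0], [-2.0]])
logits_fn = lambda x: x @ W
model = (logits_fn, "sigmoid")

x = np.array([[0.5, 0.25]])
full_output = sigmoid(logits_fn(x))                                   # model prediction
raw_logits = model_wo_output_activation(model, activation="sigmoid")(x)  # stripped model
```

Analysis methods that need pre-activation scores would then run on raw_logits; since sigmoid, like softmax, is monotone, stripping it does not change which class the model favors.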
I have also edited all examples and documentation that cover this function, so everything should now use the new call.