Hello, may I ask why, in the forward of class lstm1, hn_o+hn_1 is used as the input to the ReLU layer, rather than using hn_1 or output directly?
Originally posted by @mmm656 in #2 (comment)
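For context, here is a minimal sketch of what the forward in question might look like. This is a hypothetical reconstruction from the names mentioned in the thread, taking hn_o and hn_1 to be the final hidden states of two stacked LSTM layers; the actual lstm1 may differ:

```python
import torch
import torch.nn as nn

class lstm1(nn.Module):
    """Hypothetical reconstruction of the model the question refers to."""
    def __init__(self, input_size, hidden_size, output_size):
        super().__init__()
        # Two stacked LSTM layers; hn then has shape (2, batch, hidden_size)
        self.lstm = nn.LSTM(input_size, hidden_size, num_layers=2, batch_first=True)
        self.relu = nn.ReLU()
        self.fc = nn.Linear(hidden_size, output_size)

    def forward(self, x):
        output, (hn, cn) = self.lstm(x)
        hn_o, hn_1 = hn[0], hn[1]      # final hidden state of each layer
        out = self.relu(hn_o + hn_1)   # the sum in question, fed to ReLU
        return self.fc(out)
```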
You can choose to use hn_1 or output directly as the output.
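For comparison, a small runnable sketch of the two alternatives mentioned in this reply, assuming a 2-layer unidirectional PyTorch LSTM as above. Note that for such an LSTM the last time step of output coincides with the top layer's final hidden state hn_1:

```python
import torch
import torch.nn as nn

lstm = nn.LSTM(input_size=8, hidden_size=16, num_layers=2, batch_first=True)
x = torch.randn(4, 10, 8)              # (batch, seq_len, input_size)
output, (hn, cn) = lstm(x)

# Alternative 1: top layer's final hidden state only
feat = torch.relu(hn[1])

# Alternative 2: last time step of output; for a unidirectional LSTM
# this is numerically the same tensor as hn[-1] (i.e. hn_1)
feat = torch.relu(output[:, -1, :])
assert torch.allclose(output[:, -1, :], hn[-1])
```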
However, using output directly as the output leads to a large error, possibly because training gets stuck in a local minimum. I wonder whether that consideration is why you switched to hn_o+hn_1. Is there theoretical support for this choice, or is it purely empirical?