[Dy2st]Fix error when set buffer in forward #38540
Merged
PR types
Bug fixes
PR changes
Others
Describe
After #37296 was merged, the example code below exposes two problems:

1. In dynamic-to-static (dy2st) mode, the subnet's `self.a` is a buffer variable. When it is set in `forward` via
   `self.a = self.a + 10`
   the `param_guard` operation that used to run in `Layer.__call__` has been removed, so entries in `Layer._buffers` are no longer converted to `Variable`. `Layer.__setattr__` therefore falls into the `if` branch shown in the figure below, where `assign` creates a new variable: `_buffers[name]` changes from the original tensor to the variable produced by `assign`, and `jit.save` then raises an error.
2. In dy2st mode, after `self.b` is set to `None` and then reassigned to a tensor, an error similar to the one in 1 occurs; we now forbid this kind of operation in the code.