
[static] modify backward prune logic for EmptyGradOpMaker #53746

Merged: 31 commits (May 16, 2023)
31 commits, all by xiaoguoguo626807:

- 7bdf59c add rules (Apr 27, 2023)
- bbdf3f6 Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into… (May 4, 2023)
- f67d10f modify no kernel yaml parse (May 4, 2023)
- 2763cb8 fix_conflict (May 4, 2023)
- eaac527 success op generate (May 5, 2023)
- 72a6ef5 Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into… (May 6, 2023)
- 7198b0d success test_silu_double (May 8, 2023)
- 9c2bb06 modify bug (May 8, 2023)
- 0a2dc96 modify static error (May 9, 2023)
- d317192 Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into… (May 9, 2023)
- eb6e539 modify silu_grad input (May 9, 2023)
- 095d590 fix conflict (May 11, 2023)
- c13db26 modify kernel signature (May 11, 2023)
- 6ac4947 modify kernel signature (May 11, 2023)
- 94469e8 Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into… (May 11, 2023)
- a0903a2 code style (May 11, 2023)
- c9110f3 Merge branch 'develop' into silu_double_grad (May 11, 2023)
- e85f951 Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into… (May 11, 2023)
- 1879741 code style (May 11, 2023)
- 455e545 review (May 12, 2023)
- af4f4c1 delete opinfo modify (May 12, 2023)
- debc77e modify gradOpMaker (May 12, 2023)
- c0ced95 modify gradOpMaker (May 12, 2023)
- 19635e8 merge and modify silu_double_grad kernel signature (May 12, 2023)
- 357497f modify silu_double_grad rules (May 12, 2023)
- 5046cef Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into… (May 12, 2023)
- 06ac607 Merge commit 'refs/pull/53605/head' of https://github.com/PaddlePaddl… (May 15, 2023)
- 04f3408 modify genarated-j2 (May 15, 2023)
- f4e4576 Merge branch 'develop' of https://github.com/PaddlePaddle/Paddle into… (May 15, 2023)
- 8eeb455 add approve rules (May 15, 2023)
- c0a8190 modify aytograd_functional_static_test (May 16, 2023)
2 changes: 2 additions & 0 deletions paddle/fluid/framework/op_info.h
@@ -97,6 +97,8 @@ class OpInfo {
return grad_op_maker_ != nullptr && !use_empty_grad_op_desc_maker_;
}

bool HasEmptyGradOpMaker() const { return use_empty_grad_op_desc_maker_; }

const DygraphGradOpMakerFN& DygraphGradOpMaker() const {
// Normally, proto_ should not be null, except some special operators, such
// as LeaklyReluDoubleGrad op.
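The new accessor lets callers tell "registered with EmptyGradOpMaker" apart from "has no ordinary grad maker", which also covers composite-only ops. A minimal Python sketch of the three states, with illustrative names rather than Paddle's real API:

```python
class OpInfoModel:
    """Toy model of the grad-maker states an operator's OpInfo can be in."""

    def __init__(self, grad_op_maker=None, use_empty_grad_op_desc_maker=False):
        self.grad_op_maker = grad_op_maker
        self.use_empty_grad_op_desc_maker = use_empty_grad_op_desc_maker

    def has_non_empty_grad_op_maker(self):
        # Mirrors HasNonEmptyGradOpMaker(): a real maker is registered.
        return self.grad_op_maker is not None and not self.use_empty_grad_op_desc_maker

    def has_empty_grad_op_maker(self):
        # Mirrors the new HasEmptyGradOpMaker(): explicitly registered as empty.
        return self.use_empty_grad_op_desc_maker


normal = OpInfoModel(grad_op_maker=lambda: "grads")  # ordinary grad maker
empty = OpInfoModel(grad_op_maker=lambda: [], use_empty_grad_op_desc_maker=True)
composite_only = OpInfoModel()  # e.g. only a composite grad maker is registered

# The old prune test and the new one disagree exactly on the third state:
assert normal.has_non_empty_grad_op_maker() and not normal.has_empty_grad_op_maker()
assert empty.has_empty_grad_op_maker() and not empty.has_non_empty_grad_op_maker()
assert not composite_only.has_non_empty_grad_op_maker()  # old check would prune it
assert not composite_only.has_empty_grad_op_maker()      # new check keeps it
```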
@@ -477,6 +477,7 @@ REGISTER_OPERATOR({{name}}, ops::{{name | to_pascal_case}}Op,
{% set backward_name = op["backward"] %}
ops::{{backward_name | to_pascal_case}}OpMaker<paddle::framework::OpDesc>,
ops::{{backward_name | to_pascal_case}}OpMaker<paddle::imperative::OpBase>,
{% elif "forward" in op %}
{% else %}
Comment on lines +480 to 481

Contributor: Was this `else` forgotten here and meant to be deleted?

Contributor Author: No, the EmptyGradOpMaker generation branch below still uses it.
paddle::framework::EmptyGradOpMaker<paddle::framework::OpDesc>,
paddle::framework::EmptyGradOpMaker<paddle::imperative::OpBase>,
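The template now has three branches: ops with a `backward` entry register real grad op makers, backward ops (those with a `forward` entry) register nothing extra, and the remaining ops fall through to `EmptyGradOpMaker`. A rough Python rendering of that branch logic, with pascal-casing and the `ops::`/`paddle::framework::` qualifiers omitted for brevity:

```python
def grad_maker_registration(op):
    """Rough sketch of which maker classes the generated REGISTER_OPERATOR
    call lists for one op definition dict (simplified, not the real codegen)."""
    if "backward" in op:
        name = op["backward"]  # assumed already pascal-cased here
        return [f"{name}OpMaker<OpDesc>", f"{name}OpMaker<OpBase>"]
    if "forward" in op:
        return []  # a backward op: no further grad maker is registered
    return ["EmptyGradOpMaker<OpDesc>", "EmptyGradOpMaker<OpBase>"]


assert grad_maker_registration({"backward": "SiluGrad"}) == [
    "SiluGradOpMaker<OpDesc>", "SiluGradOpMaker<OpBase>"]
assert grad_maker_registration({"forward": "silu"}) == []
assert grad_maker_registration({}) == [
    "EmptyGradOpMaker<OpDesc>", "EmptyGradOpMaker<OpBase>"]
```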
13 changes: 10 additions & 3 deletions paddle/fluid/pybind/pybind.cc
@@ -1325,8 +1325,7 @@ All parameter, weight, gradient are variables in Paddle.
if ((grad_op_maker == nullptr) && (grad_comp_op_maker == nullptr)) {
// Normally, proto_ should not be null, except some special
// operators, such as LeaklyReluDoubleGrad op.
-          std::string type =
-              op_info.proto_ ? op_info.proto_->type() : "unknown";
+          std::string type = op_desc.Type();
PADDLE_THROW(platform::errors::NotFound(
"Neither operator %s's GradOpMaker nor CompGradOpMaker has "
"been registered.\nPlease check whether (%s) operator has "
@@ -1348,7 +1347,8 @@ All parameter, weight, gradient are variables in Paddle.
VLOG(3) << "need skip: " << need_skip << std::endl;
if (paddle::prim::PrimCommonUtils::IsBwdPrimEnabled()) {
if ((grad_comp_op_maker != nullptr) && (!need_skip)) {
-            VLOG(3) << "Runing composite fun for " << op_desc.Type();
+            VLOG(3) << "Prim Flag Open: Runing composite grad fun for "
+                    << op_desc.Type();
grad_op_descs = grad_comp_op_maker(op_desc,
no_grad_set,
&grad_to_var,
@@ -1360,9 +1360,13 @@ All parameter, weight, gradient are variables in Paddle.
}
} else {
if (grad_op_maker != nullptr) {
VLOG(3) << "Prim Flag Close: Runing origin grad fun for "
<< op_desc.Type();
grad_op_descs = grad_op_maker(
op_desc, no_grad_set, &grad_to_var, grad_sub_block);
} else {
VLOG(3) << "Prim Flag Close: Runing composite grad fun for "
<< op_desc.Type();
Comment on lines +1363 to +1369

Contributor: Please lower this log level a bit; 3 is rather high and can easily interfere with debugging elsewhere.

Contributor Author (@xiaoguoguo626807, May 16, 2023): next pr will modify #53874
grad_op_descs = grad_comp_op_maker(op_desc,
no_grad_set,
&grad_to_var,
@@ -1390,6 +1394,9 @@ All parameter, weight, gradient are variables in Paddle.
.Get(op_type)
.HasNonEmptyGradOpMaker();
});
m.def("has_empty_grad_op_maker", [](const std::string op_type) {
return framework::OpInfoMap::Instance().Get(op_type).HasEmptyGradOpMaker();
});
m.def("has_infer_inplace", [](const std::string op_type) {
return framework::OpInfoMap::Instance().Get(op_type).HasInferInplace();
});
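The dispatch around these VLOG lines picks between the origin and composite grad makers depending on the prim flag. A toy Python model of that branching; the diff hides some context lines, so the fallback behavior here is an assumption, not a verbatim port:

```python
def choose_grad_maker(grad_op_maker, grad_comp_op_maker, bwd_prim_enabled, need_skip):
    """Toy model of the grad-maker dispatch: return a label naming which
    maker would produce the grad op descs, or raise like PADDLE_THROW."""
    if grad_op_maker is None and grad_comp_op_maker is None:
        # Neither GradOpMaker nor CompGradOpMaker has been registered.
        raise LookupError("Neither GradOpMaker nor CompGradOpMaker registered.")
    if bwd_prim_enabled and grad_comp_op_maker is not None and not need_skip:
        return "composite"  # "Prim Flag Open: Runing composite grad fun"
    if grad_op_maker is not None:
        return "origin"     # "Prim Flag Close: Runing origin grad fun"
    return "composite"      # composite-only op: fall back even with prim off


assert choose_grad_maker(object(), object(), True, False) == "composite"
assert choose_grad_maker(object(), object(), True, True) == "origin"
assert choose_grad_maker(object(), None, True, False) == "origin"
assert choose_grad_maker(None, object(), False, False) == "composite"
```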
6 changes: 3 additions & 3 deletions python/paddle/fluid/backward.py
@@ -2347,7 +2347,7 @@ def _find_op_path_(
for i, op in enumerate(block.ops):
if _some_in_set_(
op.desc.input_arg_names(), input_names
-        ) and core.has_non_empty_grad_op_maker(op.type):
+        ) and not core.has_empty_grad_op_maker(op.type):
for name in op.desc.output_arg_names():
if name not in no_grad_set:
input_names.add(name)
@@ -2366,7 +2366,7 @@ def _find_op_path_(

if _some_in_set_(
op.desc.output_arg_names(), output_names
-        ) and core.has_non_empty_grad_op_maker(op.type):
+        ) and not core.has_empty_grad_op_maker(op.type):
for name in op.desc.input_arg_names():
if name not in no_grad_set:
output_names.add(name)
@@ -2381,7 +2381,7 @@ def _find_op_path_(
op.desc.output_arg_names(), output_names
):
relevant_op_flags[i] = True
-            if core.has_non_empty_grad_op_maker(op.type):
+            if not core.has_empty_grad_op_maker(op.type):
for name in op.desc.input_arg_names():
if name not in no_grad_set:
output_names.add(name)
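All three call sites swap `core.has_non_empty_grad_op_maker(op.type)` for `not core.has_empty_grad_op_maker(op.type)`, so an op whose only grad maker is composite is no longer pruned from the op path. A simplified sketch of the forward reachability sweep in `_find_op_path_`, where the ops and the predicate are stand-ins and `no_grad_set` handling is omitted:

```python
def find_forward_reachable(ops, input_names, has_empty_grad_op_maker):
    """Propagate reachable variable names through ops in program order,
    skipping ops explicitly registered with EmptyGradOpMaker (they
    contribute no gradients to the backward path)."""
    reached = set(input_names)
    for op_type, op_inputs, op_outputs in ops:
        if (set(op_inputs) & reached) and not has_empty_grad_op_maker(op_type):
            reached |= set(op_outputs)
    return reached


ops = [
    ("matmul", ["x", "w"], ["y"]),    # ordinary grad maker
    ("silu", ["y"], ["z"]),           # composite-only grad maker: now kept
    ("assign_value", [], ["c"]),      # registered with EmptyGradOpMaker
]
empty_ops = {"assign_value"}
reached = find_forward_reachable(ops, {"x"}, lambda t: t in empty_ops)
assert reached == {"x", "y", "z"}
```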
2 changes: 1 addition & 1 deletion test/autograd/test_autograd_functional_static.py
@@ -466,7 +466,7 @@ def run_test_by_fullmatrix(self, pd_f, inps, np_hess, batch=False):
def test_square(self):
def pd_f(x):
"""Input is a square matrix."""
-            return paddle.matmul(x, x.T).flatten().sum()
+            return paddle.matmul(x, x.T).sum()

def np_hess(x):
dim = x.shape[0]
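The test change drops a redundant `flatten()`: summing every entry of `x @ x.T` yields the same scalar with or without flattening first. A quick pure-Python check, where nested lists stand in for paddle tensors and `matmul_xxT_sum` is a hypothetical helper:

```python
def matmul_xxT_sum(x):
    """Sum of all entries of x @ x^T for a square matrix x (list of lists)."""
    n = len(x)
    return sum(
        sum(x[i][k] * x[j][k] for k in range(n))  # entry (i, j) of x @ x^T
        for i in range(n)
        for j in range(n)
    )


x = [[1.0, 2.0], [3.0, 4.0]]
# x @ x^T = [[5, 11], [11, 25]]; flattened or not, the entries sum to 52.
assert matmul_xxT_sum(x) == 52.0
```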
3 changes: 2 additions & 1 deletion tools/check_file_diff_approvals.sh
@@ -81,6 +81,7 @@ API_FILES=("CMakeLists.txt"
"paddle/phi/core/kernel_context.h"
"paddle/phi/core/infermeta_utils.h"
"paddle/fluid/prim/api/composite_backward/composite_backward_api.h"
"paddle/fluid/prim/api/composite_backward/composite_double_backward_api.h"
"paddle/fluid/prim/api/manual_prim/prim_manual_api.h"
"paddle/fluid/prim/api/api.yaml"
"python/paddle/incubate/autograd/composite_rules.py"
@@ -207,7 +208,7 @@ for API_FILE in ${API_FILES[*]}; do
elif [ "${API_FILE}" == "paddle/phi/api/include/tensor.h" ] || [ "${API_FILE}" == "paddle/phi/core/tensor_base.h" ] || [ "${API_FILE}" == "paddle/phi/core/dense_tensor.h" ] || [ "${API_FILE}" == "paddle/phi/core/meta_tensor.h" ] || [ "${API_FILE}" == "paddle/phi/core/tensor_meta.h" ] || [ "${API_FILE}" == "paddle/phi/core/attribute.h" ] || [ "${API_FILE}" == "paddle/phi/core/device_context.h" ] || [ "${API_FILE}" == "paddle/phi/core/kernel_utils.h" ] || [ "${API_FILE}" == "paddle/phi/core/kernel_registry.h" ] || [ "${API_FILE}" == "paddle/phi/core/kernel_factory.h" ] || [ "${API_FILE}" == "paddle/phi/core/kernel_context.h" ] || [ "${API_FILE}" == "paddle/phi/core/infermeta_utils.h" ]; then
echo_line="You must have one RD (chenwhql, phlrain, zyfncg, YuanRisheng) approval for changing ${API_FILE} , which manages the underlying code for PaddlePaddle PHI Library.\n"
check_approval chenwhql phlrain zyfncg YuanRisheng
-    elif [ "${API_FILE}" == "paddle/fluid/prim/api/composite_backward/composite_backward_api.h" ] || [ "${API_FILE}" == "paddle/fluid/prim/api/manual_prim/prim_manual_api.h" ] || [ "${API_FILE}" == "paddle/fluid/prim/api/api.yaml" ]; then
+    elif [ "${API_FILE}" == "paddle/fluid/prim/api/composite_backward/composite_backward_api.h" ] || [ "${API_FILE}" == "paddle/fluid/prim/api/manual_prim/prim_manual_api.h" ] || [ "${API_FILE}" == "paddle/fluid/prim/api/api.yaml" ] || [ "${API_FILE}" == "paddle/fluid/prim/api/composite_backward/composite_double_backward_api.h" ]; then
echo_line="You must have one RD (JiabinYang, cxxly(chenxiaoxu) , xiaoguoguo626807(wangruting)) approval for changing ${API_FILE} , which manages the code for PaddlePaddle Composite Bacward Prim API.\n"
check_approval 1 JiabinYang cxxly xiaoguoguo626807
elif [ "${API_FILE}" == "python/paddle/incubate/autograd/primitives.py" ] || [ "${API_FILE}" == "python/paddle/incubate/autograd/composite_rules.py" ]; then