
[AutoScheduler] Support layout rewrite for whole networks #6987

Merged (12 commits) on Dec 3, 2020

Conversation

merrymercy (Member) commented Nov 28, 2020

The auto-scheduler can infer a good layout for constant tensors (e.g., the weight tensors in conv2d and dense) according to the loop structure created by the schedule. This PR adds a Relay pass to support this kind of layout rewrite at the graph level. The pass inserts the necessary layout transforms into the Relay program and pre-computes them with the FoldConstant pass.
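For context, a rough end-to-end sketch of how the rewrite is exercised when compiling a whole network. This is only a sketch: auto_scheduler.ApplyHistoryBest and the relay.backend.use_auto_scheduler PassContext option come from the auto-scheduler tuning flow and are assumed here, not introduced by this PR.

# Sketch only; ApplyHistoryBest and the relay.backend.use_auto_scheduler
# option are assumed from the auto-scheduler tuning flow, not from this PR.
import tvm
from tvm import relay, auto_scheduler

def build_with_layout_rewrite(mod, params, target, log_file):
    # Replay the tuned schedules recorded in log_file.
    with auto_scheduler.ApplyHistoryBest(log_file):
        with tvm.transform.PassContext(
            opt_level=3,
            config={"relay.backend.use_auto_scheduler": True},
        ):
            # The layout-rewrite pass inserts layout transforms for the
            # constant weights; FoldConstant then pre-computes them.
            return relay.build(mod, target=target, params=params)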

Changes overview

  • Add a Relay op auto_scheduler_layout_transform and its TOPI compute definition. This op can handle more general layout transforms than the existing relay.op.transform.layout_transform.
  • Add a Relay pass AutoSchedulerLayoutRewrite to do the layout rewrite.
  • Add a field auto_scheduler_rewritten_layout to Conv2dAttrs to indicate the new layout after rewriting.
  • See python/tvm/topi/nn/conv2d.py for examples of how to enable the layout rewrite for a TOPI compute (a sketch of the pattern follows this list).
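A rough sketch of the TOPI-side pattern referenced in the last bullet. The helper names (auto_scheduler.get_shape_from_rewritten_layout, auto_scheduler.remove_index_check) are written here as assumptions about the API this PR introduces; python/tvm/topi/nn/conv2d.py is the authoritative example.

# Sketch only; the two auto_scheduler helpers used below are assumed names.
# See python/tvm/topi/nn/conv2d.py for the real usage.
from tvm import auto_scheduler

def infer_filter_shape(Filter, auto_scheduler_rewritten_layout=""):
    """Return (kernel_h, kernel_w, channel, num_filter) of an HWIO filter."""
    if auto_scheduler_rewritten_layout:
        # The weight layout was rewritten by the auto-scheduler, so recover
        # the logical shape from the rewritten layout string instead of
        # reading Filter.shape directly.
        kernel_h, kernel_w, channel, num_filter = (
            auto_scheduler.get_shape_from_rewritten_layout(
                auto_scheduler_rewritten_layout, ["ry", "rx", "rc", "ff"]
            )
        )
        # The rewritten tensor no longer matches its declared shape,
        # so skip the out-of-bound index checks for it.
        auto_scheduler.remove_index_check(Filter)
    else:
        kernel_h, kernel_w, channel, num_filter = Filter.shape
    return kernel_h, kernel_w, channel, num_filter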

Co-authored by @minminsun.

merrymercy marked this pull request as ready for review November 29, 2020 16:11

merrymercy (Member, Author) commented Dec 2, 2020

@jcf94 @comaniac @FrozenGene @junrushao1994 Let us merge this as soon as possible, so we can get good performance on CPU and publish the CPU tutorial.
I will do more cleanup in follow-up PRs when I add layout rewrite support for more ops (conv3d, dense, batch matmul).
It will be easier to figure out the best way to modify the TOPI computes once I add support for more ops.

@@ -371,8 +379,29 @@ def conv2d_nhwc(Input, Filter, stride, padding, dilation, out_dtype="float32"):
    else:
        dilation_h, dilation_w = dilation

    if auto_scheduler_rewritten_layout:
        # Infer shape for the rewritten layout
Review comment (Member):
As discussed, we need to extract it. If convenient, we could mark this as a TODO.


    with tempfile.NamedTemporaryFile() as fp:
        log_file = fp.name
        # log_file = "test_layout_rewrite.json"
Review comment (Member):
Remove it. If we want to merge this soon, it could be left for the cleanup PR.

include/tvm/ir/transform.h: review conversation resolved (outdated)
merrymercy merged commit a7bf979 into apache:main on Dec 3, 2020
merrymercy deleted the pr-layout-rewrite branch on December 3, 2020 02:05
trevor-m pushed a commit to trevor-m/tvm that referenced this pull request Dec 3, 2020
* [AutoScheduler] Add layout rewrite pass in relay
* fix
* fix lint
* fix attrs
* trigger CI
* Apply suggestions from code review
* trigger CI
* Update python/tvm/auto_scheduler/relay_integration.py
* Update python/tvm/auto_scheduler/relay_integration.py
* Update python/tvm/auto_scheduler/compute_dag.py
* Trigger CI
* Apply suggestions from code review
trevor-m pushed a commit to trevor-m/tvm that referenced this pull request Dec 4, 2020
trevor-m pushed a commit to neo-ai/tvm that referenced this pull request Dec 4, 2020