
Add Post-Training Quantization and export function in dygraph mode #50107

Merged: 9 commits into PaddlePaddle:develop on Feb 16, 2023

Conversation

@wanghaoshuang (Contributor) commented Jan 31, 2023

PR types

New features

PR changes

APIs

Describe

  1. Add Post-Training Quantization
    1.1 Abstract some functions from QAT to Quantization class
    1.2 Add Post-Training Quantization by extending Quantization class
    1.3 Add observers for PTQ
    1.4 Add unittest for PTQ
  2. Add exporting function for QAT and PTQ
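
The structure described in items 1.1 and 1.2 — pulling shared logic out of QAT into a common base class and deriving PTQ from it — can be sketched as follows. All names here (Quantization, QAT, PTQ, quantize) are illustrative assumptions for this sketch, not the exact classes this PR adds:

```python
# Illustrative sketch only: shared Quantization base class (item 1.1)
# with PTQ added by extension (item 1.2). Names are assumptions.

class Quantization:
    """Logic shared by QAT and PTQ, abstracted from the QAT class."""

    def __init__(self, config):
        self.config = config

    def convert(self, model):
        # Common conversion path used by both subclasses.
        return {"model": model, "config": self.config, "mode": self.mode}


class QAT(Quantization):
    mode = "qat"  # fake-quant training


class PTQ(Quantization):
    mode = "ptq"  # calibration with observers, no training

    def quantize(self, model):
        # PTQ-specific step: wrap layers with observers before calibration.
        return f"observed({model})"
```

Under this sketch, `PTQ(cfg).quantize(model)` produces the observer-wrapped model shown in the example below, and `convert` is shared with QAT.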

TODO

Sub-task of the dygraph quantization refactor:

PTQ example

Original model:

LeNetDygraph(
  (features): Sequential(
    (0): Conv2D(1, 6, kernel_size=[3, 3], padding=1, data_format=NCHW)
    (1): ReLU()
    (2): MaxPool2D(kernel_size=2, stride=2, padding=0)
    (3): Conv2D(6, 16, kernel_size=[5, 5], data_format=NCHW)
    (4): ReLU()
    (5): MaxPool2D(kernel_size=2, stride=2, padding=0)
  )
  (fc): Sequential(
    (0): Linear(in_features=400, out_features=120, dtype=float32)
    (1): Linear(in_features=120, out_features=84, dtype=float32)
    (2): Linear(in_features=84, out_features=10, dtype=float32)
  )
)

Converted PTQ model:

LeNetDygraph(
  (features): Sequential(
    (0): QuantedConv2D(
      (weight_quanter): AbsmaxObserverLayer()
      (activation_quanter): AbsmaxObserverLayer()
    )
    (1): ObserveWrapper(
      (_observer): AbsmaxObserverLayer()
      (_observed): ReLU()
    )
    (2): ObserveWrapper(
      (_observer): AbsmaxObserverLayer()
      (_observed): MaxPool2D(kernel_size=2, stride=2, padding=0)
    )
    (3): QuantedConv2D(
      (weight_quanter): AbsmaxObserverLayer()
      (activation_quanter): AbsmaxObserverLayer()
    )
    (4): ObserveWrapper(
      (_observer): AbsmaxObserverLayer()
      (_observed): ReLU()
    )
    (5): ObserveWrapper(
      (_observer): AbsmaxObserverLayer()
      (_observed): MaxPool2D(kernel_size=2, stride=2, padding=0)
    )
  )
  (fc): Sequential(
    (0): QuantedLinear(
      (weight_quanter): AbsmaxObserverLayer()
      (activation_quanter): AbsmaxObserverLayer()
    )
    (1): QuantedLinear(
      (weight_quanter): AbsmaxObserverLayer()
      (activation_quanter): AbsmaxObserverLayer()
    )
    (2): QuantedLinear(
      (weight_quanter): AbsmaxObserverLayer()
      (activation_quanter): AbsmaxObserverLayer()
    )
  )
)
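
Every wrapped layer above carries an AbsmaxObserverLayer, which records activation statistics during calibration. As a rough illustration of what an absmax observer computes (the real Paddle implementation differs in API and detail — this is a pure-Python sketch, not the PR's code):

```python
# Hedged sketch of absmax observation: track max(|x|) over calibration
# batches, then derive an int8 scale from it. Illustrative only.

class AbsmaxObserver:
    def __init__(self, bit_length=8):
        self.qmax = 2 ** (bit_length - 1) - 1  # 127 for int8
        self.absmax = 0.0

    def observe(self, tensor):
        # Track the largest absolute value seen across calibration batches.
        self.absmax = max(self.absmax, max(abs(v) for v in tensor))
        return tensor  # pass-through, like the ObserveWrapper above

    def scale(self):
        return self.absmax / self.qmax

    def quantize(self, tensor):
        s = self.scale()
        return [round(v / s) for v in tensor]

obs = AbsmaxObserver()
obs.observe([0.1, -0.5, 0.3])
obs.observe([0.2, 1.27, -0.9])
print(obs.scale())          # 1.27 / 127 = 0.01
print(obs.quantize([0.5]))  # [50]
```

Because the observer is a pass-through, calibration is just running representative data through the converted model; no gradients are needed, which is the difference between PTQ and QAT.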

Model export example

(screenshot of the exported model omitted)
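
The export step (see the commit "Add code to quantize weights and remove quanter" below) folds each weight quanter's scale into integer weights and then drops the quanter so the exported graph is inference-only. A minimal sketch of that idea, with hypothetical layer structure and names (the PR's actual export code operates on dygraph layers, not dicts):

```python
# Hedged sketch of exporting a quantized layer: quantize the fp32
# weights with the observer's scale, then remove the quanter. The
# dict-based "layer" here is a stand-in for illustration only.

def export_layer(layer):
    """Fold the weight quanter's scale into int8 weights, drop the quanter."""
    scale = layer["weight_quanter"]["scale"]
    layer["weight_int8"] = [round(w / scale) for w in layer["weight"]]
    # The exported layer keeps only what inference needs.
    del layer["weight_quanter"]
    del layer["weight"]
    return layer

quanted_linear = {
    "weight": [0.02, -0.05, 0.04],
    "weight_quanter": {"scale": 0.01},
}
print(export_layer(quanted_linear))  # {'weight_int8': [2, -5, 4]}
```

Removing the quanter at export time is what lets the same quantized model definition serve both calibration and deployment.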

@paddle-bot commented Jan 31, 2023

Your PR has been submitted. Thanks for your contribution!
Please wait for the CI result first. See the Paddle CI Manual for details.

Later commits:

1. Add ConvertibleQuantedLayer class to help convert quantized layers
2. Add code to quantize weights and remove the quanter
@wanghaoshuang changed the title from "Add Post-Training Quantization in dygraph mode" to "Add Post-Training Quantization and export function in dygraph mode" on Feb 1, 2023
yghstill previously approved these changes Feb 8, 2023
ceci3 previously approved these changes Feb 8, 2023
@ceci3 (Contributor) left a comment:
LGTM

@wanghaoshuang dismissed stale reviews from ceci3 and yghstill via 6b2b932 on February 9, 2023
ceci3 previously approved these changes Feb 10, 2023
chenwhql previously approved these changes Feb 10, 2023
@chenwhql (Contributor) left a comment:
LGTM for setup.py.in

zhangbo9674 previously approved these changes Feb 10, 2023
@zhangbo9674 (Contributor) left a comment:
LGTM for setup

@wanghaoshuang dismissed stale reviews from zhangbo9674 and chenwhql via 02cb25e on February 14, 2023
jeff41404 previously approved these changes Feb 15, 2023
@jeff41404 (Contributor) left a comment:
LGTM for APIs

@jzhang533 (Contributor) left a comment:
LGTM

@zhangbo9674 (Contributor) left a comment:
Approve for setup

@wanghaoshuang merged commit b703025 into PaddlePaddle:develop on Feb 16, 2023
@wanghaoshuang deleted the ptq branch on February 16, 2023
7 participants