Conversation

Xia-Weiwen
Collaborator

Summary
The test cases for SmoothQuant have the following issues:

  • All cases are skipped in CPU-only environments
  • The test case test_smoothquant_accuracy uses a bare linear as the model, but the convert step only swaps submodules, so the top-level linear is never converted to SmoothQuantObservedLinear and the case never actually tests the output of a SmoothQuant linear. When running with pytest -sv, the following warning is printed:
    • convert: module is not SmoothQuantObservedLinear, skipping: <class 'torch.nn.modules.linear.Linear'>

This PR fixes the issues.
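The failure mode can be sketched without torchao: module-swap style conversion iterates over a parent's child modules, so a bare top-level module is never replaced. A minimal pure-Python mimic (class and function names here are stand-ins, not torchao's actual API):

```python
class Linear:
    """Stand-in for torch.nn.Linear."""

class ObservedLinear(Linear):
    """Stand-in for SmoothQuantObservedLinear."""

def convert(model):
    """Swap child attributes that are Linear for ObservedLinear.

    Mirrors the module-swap pattern: only submodules are replaced;
    the top-level object itself is never reassigned.
    """
    for name, child in list(vars(model).items()):
        if isinstance(child, Linear):
            setattr(model, name, ObservedLinear())
    return model

class Wrapper:
    """Parent module holding a linear as a submodule."""
    def __init__(self):
        self.fc = Linear()

# A bare Linear passed directly comes back unchanged (the bug the test hit)...
bare = convert(Linear())
# ...while a Linear wrapped in a parent module does get swapped.
wrapped = convert(Wrapper())
```

This is why the fix wraps the linear in a parent module before calling convert.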

Test plan
pytest -sv test/prototype/test_smoothquant.py


pytorch-bot bot commented Sep 30, 2025

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/ao/3101

Note: Links to docs will display an error until the docs builds have been completed.

✅ No Failures

As of commit 389b91d with merge base 5cbbd73:
💚 Looks good so far! There are no failures yet. 💚

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@meta-cla meta-cla bot added the CLA Signed label Sep 30, 2025
@Xia-Weiwen Xia-Weiwen added the topic: not user facing label and removed the CLA Signed label Sep 30, 2025
@Xia-Weiwen
Collaborator Author

CC @namgyu-youn. I cannot add you as a reviewer.

@Xia-Weiwen Xia-Weiwen marked this pull request as ready for review September 30, 2025 02:06
Contributor

@jerryzh168 jerryzh168 left a comment


LGTM, thanks for the fix

Contributor

@namgyu-youn namgyu-youn left a comment


Injecting outliers looks good to me, thanks for the fix.

@Xia-Weiwen
Collaborator Author

> Injecting outliers looks good to me, thanks for the fix.

Thanks. Actually, the cases may fail if we don't inject outliers: with some random inputs, alpha=0.5 gives worse accuracy than alpha=0.
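On the alpha point: SmoothQuant computes a per-input-channel smoothing factor s_j = max|X_j|^alpha / max|W_j|^(1-alpha), so without an activation outlier there is little for alpha=0.5 to smooth, and it can shift quantization difficulty in the wrong direction. A toy illustration in plain Python (the channel values are invented, not from the test):

```python
def smoothing_scales(act_absmax, w_absmax, alpha):
    # SmoothQuant per-input-channel factor:
    #   s_j = max|X_j|**alpha / max|W_j|**(1 - alpha)
    # Activations are divided by s_j; weights are multiplied by it.
    return [a ** alpha / w ** (1.0 - alpha)
            for a, w in zip(act_absmax, w_absmax)]

# Channel 2 carries an injected activation outlier (made-up magnitudes).
act_max = [1.0, 1.0, 100.0]
w_max = [1.0, 1.0, 1.0]

print(smoothing_scales(act_max, w_max, 0.5))  # [1.0, 1.0, 10.0]: outlier channel damped
print(smoothing_scales(act_max, w_max, 0.0))  # [1.0, 1.0, 1.0]: no smoothing at all
```

With the outlier injected, alpha=0.5 visibly rebalances the hard channel, which is what the updated test relies on.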

@meta-cla meta-cla bot added the CLA Signed label Sep 30, 2025
Contributor

@namgyu-youn namgyu-youn left a comment


I thought we could reproduce outliers by adjusting ratio and size, but sorry, ratio and size look good as-is; the following comments should be removed.

@namgyu-youn
Contributor

> Thanks. Actually, the cases may fail if we don't. That means alpha=0.5 is worse than 0 with some random inputs.

Yeah, outliers are key to exercising SmoothQuant. Thanks for this change.

@Xia-Weiwen Xia-Weiwen merged commit 8955739 into pytorch:main Oct 2, 2025
22 checks passed