
Inaccurate comparison #29

Closed
niftylettuce opened this issue Oct 20, 2017 · 10 comments

@niftylettuce

I literally have two identical screenshots, and it's telling me that the hex color is different, so the test fails because they're not equal.

@niftylettuce
Author

I've also tried setting ignoreAntialiasing: true, and it still didn't work. The screenshots are exactly the same.

@chrisdeely
Contributor

Can you supply the screenshots for analysis?

@niftylettuce
Author

I've tested this on tons of websites, with the exact same screenshots taken, and they all have the same issue. I'm taking my screenshots with Puppeteer, by the way; perhaps you could add that as a test case.

@chrisdeely
Contributor

Ok, but can you attach any two of these PNGs so we can attempt to debug the issue?

@niftylettuce
Author

I'm pretty sure this is Puppeteer rendering the same page slightly differently each time (gradients or something), not an issue with this library.

@niftylettuce
Author

@chrisdeely Basically, I can run the same tests multiple times and some pass, some fail; it's very sporadic.

@niftylettuce
Author

Closing this, filed puppeteer/puppeteer#1103

@chrisdeely
Contributor

Maybe you can reproduce the issue with a publicly available site?
Also take a look at my PR #30, which implements a transparency option that lets you create a diff image with all of the matching pixels faded out. That may help point out any minute differences.
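For anyone following along, the "matching pixels faded out" idea can be sketched in plain Node.js. This is a hypothetical illustration, not the actual PR #30 code: given two equal-sized RGBA buffers, keep matching pixels but drop their alpha, and paint differing pixels in an opaque highlight color (the `fadeAlpha` and `highlight` names here are made up for the sketch).

```javascript
// Sketch (not the real PR #30 implementation): build a diff image from two
// equal-sized RGBA byte buffers, fading matching pixels so that only the
// differing pixels stand out.
function fadedDiff(imgA, imgB, { fadeAlpha = 32, highlight = [255, 0, 255] } = {}) {
  if (imgA.length !== imgB.length) {
    throw new Error('images must have the same dimensions');
  }
  const out = new Uint8ClampedArray(imgA.length);
  for (let i = 0; i < imgA.length; i += 4) {
    const same =
      imgA[i] === imgB[i] &&
      imgA[i + 1] === imgB[i + 1] &&
      imgA[i + 2] === imgB[i + 2];
    if (same) {
      // keep the original color but make it nearly transparent
      out[i] = imgA[i];
      out[i + 1] = imgA[i + 1];
      out[i + 2] = imgA[i + 2];
      out[i + 3] = fadeAlpha;
    } else {
      // paint differing pixels in the highlight color, fully opaque
      out[i] = highlight[0];
      out[i + 1] = highlight[1];
      out[i + 2] = highlight[2];
      out[i + 3] = 255;
    }
  }
  return out;
}
```

The resulting buffer could then be written out as a PNG with any image library; the point is just that a faded diff makes one-pixel rendering differences easy to spot.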

@TimDaub

TimDaub commented Feb 4, 2021

I think I've found the same problem. See:

Generated on Mac OS: https://i.imgur.com/7kAWuMX.png
Generated with GitHub Action (Ubuntu): https://i.imgur.com/BvGxuWM.png

Generated on Mac OS: https://i.imgur.com/PAgszk9.png
Generated with GitHub Action (Ubuntu): https://i.imgur.com/LeICbqQ.png

looksSame deems that they're different, even when setting ignoreAntialiasing: true.

Edit:

By using Arial and text-rendering: geometricPrecision;, I think I've achieved further similarity between GH Actions and Mac OS:

Mac OS: https://i.imgur.com/QIpzyg4.png
GH Action: https://i.imgur.com/GIOs4JP.png

Still, looksSame with ignoreAntialiasing: true doesn't recognize the two images as equal.

@DudaGod
Member

DudaGod commented Feb 4, 2021

> I think I've found the same problem. See:
>
> Generated on Mac OS: https://i.imgur.com/7kAWuMX.png
> Generated with GitHub Action (Ubuntu): https://i.imgur.com/BvGxuWM.png
>
> Generated on Mac OS: https://i.imgur.com/PAgszk9.png
> Generated with GitHub Action (Ubuntu): https://i.imgur.com/LeICbqQ.png
>
> looksSame deems that they're different, even when setting ignoreAntialiasing: true.

Try opening the pictures in adjacent browser tabs and switching between them. You will see that the font used is different; it's not an antialiasing problem.

> By using Arial and text-rendering: geometricPrecision;, I think I've achieved further similarity between GH Actions and Mac OS:
>
> Mac OS: https://i.imgur.com/QIpzyg4.png
> GH Action: https://i.imgur.com/GIOs4JP.png
>
> Still, looksSame with ignoreAntialiasing: true doesn't recognize the two images as equal.

It's still not an antialiasing problem. Linux and macOS render fonts differently, and you will always see differences between them. If you want looks-same to say these images are equal, you should increase the tolerance. But I recommend not doing that, and instead comparing images captured on the same OS.
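To make the tolerance suggestion concrete, here is a minimal sketch of what a tolerance means in pixel comparison. It is an assumption-laden simplification: looks-same itself uses a perceptual color-difference metric rather than this plain per-channel threshold, and `roughlyEqual` is a made-up name. The point is that a one-unit channel difference, which is roughly what slightly different font or gradient rendering produces, fails an exact comparison but passes with a small tolerance.

```javascript
// Simplified sketch of tolerance-based comparison: two RGBA byte buffers
// count as equal only if every channel differs by at most `tolerance`.
function roughlyEqual(imgA, imgB, tolerance = 0) {
  if (imgA.length !== imgB.length) return false;
  for (let i = 0; i < imgA.length; i++) {
    if (Math.abs(imgA[i] - imgB[i]) > tolerance) return false;
  }
  return true;
}

// One pixel, one channel off by a single unit — the kind of difference
// cross-OS rendering produces.
const pixA = Uint8ClampedArray.from([120, 64, 200, 255]);
const pixB = Uint8ClampedArray.from([121, 64, 200, 255]);
roughlyEqual(pixA, pixB, 0); // false — exact comparison fails
roughlyEqual(pixA, pixB, 2); // true — a small tolerance accepts it
```

In looks-same itself the tolerance is passed in the options object alongside ignoreAntialiasing; see the project README for the exact signature and the default value.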
