
Do not include blocked pages #164

Merged
merged 2 commits into PrestaShop:dev on Jun 15, 2023

Conversation

@Hlavtox (Contributor) commented on Jun 14, 2023

Questions Answers
Description? When generating the robots.txt file, PrestaShop marks some customer/internal pages as disallowed. Check them out here. This module allows those pages to be added to the XML sitemap, which results in Search Console errors about pages being present in the sitemap but not crawlable.
Type? bug fix
BC breaks? no
Deprecations? no
Fixed ticket? Fixes PrestaShop/PrestaShop#18768
How to test? Check that the blocked pages no longer appear in the module configuration or in the generated sitemap.
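The idea behind the fix can be sketched as follows: before a page URL is written to the sitemap, check it against the site's robots.txt disallow rules and drop anything a crawler is not allowed to fetch. This is a minimal Python illustration using the standard-library robots.txt parser (PrestaShop itself is written in PHP, and its real disallow list differs); the robots.txt content and URLs below are made up for the example.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; not PrestaShop's actual rules.
robots_txt = """\
User-agent: *
Disallow: /cart
Disallow: /my-account
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Candidate pages the sitemap generator would otherwise include.
candidates = [
    "https://shop.example.com/",
    "https://shop.example.com/cart",
    "https://shop.example.com/my-account",
    "https://shop.example.com/category/shoes",
]

# Keep only pages a generic crawler ("*") is allowed to fetch,
# so the sitemap never lists a URL that robots.txt blocks.
sitemap_urls = [url for url in candidates if rp.can_fetch("*", url)]
print(sitemap_urls)
# ['https://shop.example.com/', 'https://shop.example.com/category/shoes']
```

Filtering at generation time, as the PR does, avoids the Search Console "indexed, though blocked by robots.txt"-style conflicts described above, because the sitemap and robots.txt can never disagree.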

@Hlavtox Hlavtox added this to the 4.3.1 milestone Jun 14, 2023
@MhiriFaten MhiriFaten self-assigned this Jun 15, 2023

@MhiriFaten MhiriFaten left a comment


Hello @Hlavtox,

I have checked your PR and the blocked pages are no longer included.
It is QA approved ✔️

@nicosomb nicosomb merged commit 949b011 into PrestaShop:dev Jun 15, 2023
@nicosomb (Contributor) commented:

Thank you @Hlavtox!

@Hlavtox Hlavtox deleted the robots branch July 26, 2023 22:36
@Hlavtox Hlavtox mentioned this pull request Jan 10, 2024

Successfully merging this pull request may close these issues.

[Google Sitemap] Why does the module allow adding sites blocked by robots?
5 participants