
fix(content-pages): add article wrapper around MDXContent #7595

Merged
merged 1 commit into facebook:main
Jun 16, 2022

Conversation

matkoch
Contributor

@matkoch matkoch commented Jun 11, 2022

Pre-flight checklist

  • I have read the Contributing Guidelines on pull requests.
  • If this is a code change: I have written unit tests and/or added dogfooding pages to fully verify the new behavior.
  • If this is a new API or substantial change: the PR has an accompanying issue (closes #0000) and the maintainers have approved my working plan.

Motivation

When I create a simple page according to this, the content is not wrapped in an <article> tag, so it is also not indexed by the default crawler configuration I found.
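To illustrate why the missing wrapper breaks indexing, here is a minimal sketch (the `wrapInArticle` helper is hypothetical, not part of Docusaurus; it only demonstrates the markup difference selector-based crawlers care about):

```typescript
// Hypothetical illustration: selector-based crawlers (like DocSearch's)
// extract text from nodes matching selectors such as "article p".
// If a page's rendered MDX has no <article> root, those selectors
// match nothing and the page yields no search records.
function wrapInArticle(contentHtml: string): string {
  return `<article>${contentHtml}</article>`;
}

const rendered = wrapInArticle("<h1>Hello</h1><p>A simple page.</p>");
// rendered now has an <article> root that "article p"-style selectors can match
```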

Test Plan

Test links

Deploy preview: https://deploy-preview-_____--docusaurus-2.netlify.app/

Related issues/PRs

@facebook-github-bot facebook-github-bot added the CLA Signed Signed Facebook CLA label Jun 11, 2022
@Josh-Cena Josh-Cena added the pr: bug fix This PR fixes a bug in a past release. label Jun 11, 2022
@Josh-Cena Josh-Cena changed the title Add article wrapper around MDXContent fix(content-pages): add article wrapper around MDXContent Jun 11, 2022
@netlify

netlify bot commented Jun 11, 2022

[V2]

Built without sensitive environment variables

Name Link
🔨 Latest commit 27db6fa
🔍 Latest deploy log https://app.netlify.com/sites/docusaurus-2/deploys/62a3f3a27e6fea00091774fa
😎 Deploy Preview https://deploy-preview-7595--docusaurus-2.netlify.app

@github-actions

⚡️ Lighthouse report for the deploy preview of this PR

URL Performance Accessibility Best Practices SEO PWA Report
/ 🟠 72 🟢 100 🟢 100 🟢 100 🟢 90 Report
/docs/installation 🟠 88 🟢 100 🟢 100 🟢 100 🟢 90 Report

@slorber
Collaborator

slorber commented Jun 16, 2022

hey @matkoch, that looks like a reasonable change, yes 👍

BTW, where did you find this link? That repo is archived, and you can now configure the crawler yourself.

The recommended config is here: https://docsearch.algolia.com/docs/templates/#docusaurus-v2-template
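For context on why the `<article>` wrapper matters to these configs: the DocSearch templates scope their text selectors to the article element. An illustrative fragment (the exact selector strings are an assumption; see the linked template for the authoritative values):

```json
{
  "selectors": {
    "lvl0": "article h1",
    "text": "article p, article li"
  }
}
```

With selectors scoped like this, content rendered outside an `<article>` root is simply never picked up as a record.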

@slorber slorber merged commit 5fe33be into facebook:main Jun 16, 2022
@matkoch
Contributor Author

matkoch commented Jun 18, 2022

@slorber I haven't applied for the updated plan / automatic crawling yet. Currently, I'm still crawling through a Docker container. Are there any other disadvantages?

The config is linked from here: https://docsearch.algolia.com/docs/legacy/run-your-own#create-a-new-configuration

@slorber
Collaborator

slorber commented Jun 22, 2022

@shortcuts any idea why the new docs no longer mention how to run your own Docker crawler?

@shortcuts
Contributor

shortcuts commented Jun 22, 2022

@slorber the documentation is still available under the legacy version; we have a few mentions of it in the current one (mostly in FAQs), but no dedicated section: https://docsearch.algolia.com/docs/legacy/run-your-own

@slorber
Collaborator

slorber commented Jun 22, 2022

Thanks.

So what I understand is that running your own scraper locally is now deprecated without replacement? Are all users supposed to migrate to the online crawler?

Because we link to this page in our docs, I wonder if we should keep it: https://docsearch.algolia.com/docs/legacy/run-your-own/

@shortcuts
Contributor

> So what I understand is that running your own scraper locally is now deprecated without replacement?

We won't maintain it or add new features to it, but we will accept/review community contributions if any. We also won't break the record shape required to make the DocSearch UI (frontend) work, so this scraper can still be recommended and used.

> Are users supposed to all migrate to the online crawler?

For users that are eligible for the DocSearch program, yes.

> Because we link to this page in our doc, wonder if we should keep it

It's a good alternative (or starting point) for users that don't have access to the Algolia Crawler, and it correctly crawls websites and returns records, so I'd say it's worth keeping!

@slorber
Collaborator

slorber commented Jun 22, 2022

Ok thanks :)
