
add robots.txt #904

Merged
geoffcline merged 3 commits into aws:main from gdc-robots-txt on Dec 7, 2021

Conversation

geoffcline
Contributor

1. Issue, if available:
The lack of a /robots.txt file hurts SEO and increases 404 errors from crawlers requesting it.

2. Description of changes:
Adds a robots.txt based on the one in kubernetes/website; a sketch of the file appears after this description.

3. Does this change impact docs?

  • Yes, PR includes docs updates
  • Yes, issue opened: link to issue
  • No

By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.
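
For context, a minimal sketch of the kind of robots.txt this PR adds, reconstructed only from the hunks quoted in the review below; the comments and the Allow: / line (reflecting the homepage discussion) are illustrative assumptions, not necessarily the committed file:

User-agent: *
# Hide the old, versioned docs tree so crawlers stop requesting stale URLs
Disallow: /v0.4.3-docs/
# Current docs and (per the review discussion) the homepage stay crawlable
Allow: /docs/
Allow: /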

@geoffcline geoffcline added the documentation Improvements or additions to documentation label Dec 3, 2021
@geoffcline geoffcline self-assigned this Dec 3, 2021
@netlify

netlify bot commented Dec 3, 2021

✔️ Deploy Preview for karpenter-docs-prod ready!

🔨 Explore the source changes: 69fa393

🔍 Inspect the deploy log: https://app.netlify.com/sites/karpenter-docs-prod/deploys/61ab0d46457aaa00076db4f4

😎 Browse the preview: https://deploy-preview-904--karpenter-docs-prod.netlify.app

Contributor

@akestner akestner left a comment


Couple questions


Disallow: /v0.4.3-docs/

Allow: /docs/
Contributor

Should this also include the homepage?

Contributor Author

sure

Contributor Author

I've read https://developers.google.com/search/docs/advanced/robots/robots_txt and don't really understand Allow. It just seems to say Google may index the path, but it also seems that everything not explicitly disallowed is indexed anyway.
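
For what it's worth, the linked Google documentation treats any path that no Disallow rule matches as crawlable, so Allow is really only needed to carve an exception out of a broader Disallow; when rules conflict, the most specific (longest) matching path wins for Google's crawler. A hypothetical example, not the file in this PR:

User-agent: *
# Block everything by default...
Disallow: /
# ...except the docs tree, which the longer, more specific Allow re-opens
Allow: /docs/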

@@ -6,6 +6,7 @@ baseURL = "/"
disableKinds = ["taxonomy", "taxonomyTerm"]

# Language settings
enableRobotsTXT = true
Contributor

Does this mean the robots.txt that's committed was generated by Hugo?

Contributor Author

No. It just means that the committed robots.txt makes it into the live site.
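
For background, and as an assumption worth verifying against the Hugo documentation rather than a statement about how this site is wired: enableRobotsTXT = true tells Hugo to render /robots.txt from a template, layouts/robots.txt if one exists, otherwise Hugo's built-in template, which emits only a blanket allow:

# Roughly what Hugo's built-in robots.txt template produces (assumption)
User-agent: *

A robots.txt committed under static/, by contrast, is copied to the published site root as-is regardless of the flag. Which of the two mechanisms this PR uses is not visible from the hunks quoted here.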

@@ -0,0 +1,8 @@
User-agent: *

Disallow: /v0.4.3-docs/
Contributor

This is going to get stale. We need a longer term solution to avoid updating this on every release. Perhaps we can simply allow /docs.
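
One hedged option for avoiding a per-release edit, offered as an illustration rather than what was merged: Google and most major crawlers honor the * wildcard in robots.txt path rules (the original standard does not require it), so a single pattern could cover every versioned docs tree:

User-agent: *
# Hypothetical pattern: matches /v0.4.3-docs/ and any future /v<version>-docs/ path
Disallow: /v*-docs/
Allow: /docs/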

@geoffcline geoffcline merged commit 5af4820 into aws:main Dec 7, 2021
@geoffcline geoffcline deleted the gdc-robots-txt branch December 7, 2021 18:04