add robots.txt #904
Conversation
✔️ Deploy Preview for karpenter-docs-prod ready!
🔨 Explore the source changes: 69fa393
🔍 Inspect the deploy log: https://app.netlify.com/sites/karpenter-docs-prod/deploys/61ab0d46457aaa00076db4f4
😎 Browse the preview: https://deploy-preview-904--karpenter-docs-prod.netlify.app
Couple questions
Disallow: /v0.4.3-docs/
Allow: /docs/
Should this also include the homepage?
sure
I've read https://developers.google.com/search/docs/advanced/robots/robots_txt and don't really understand Allow. It just seems to say Google may index it, but it also seems that everything not explicitly disallowed is indexed.
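(For what it's worth: at least for Googlebot, Allow mainly acts as an exception to a broader Disallow, with the longest matching path winning. A hypothetical sketch:

User-agent: *
Disallow: /
Allow: /docs/

would block everything except /docs/. Without a broader Disallow that also matches, an Allow line is effectively a no-op, since anything not explicitly disallowed is crawlable by default.)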
@@ -6,6 +6,7 @@ baseURL = "/"
disableKinds = ["taxonomy", "taxonomyTerm"]

# Language settings
+enableRobotsTXT = true
Does this mean the robots.txt that's committed was generated by Hugo?
no. it just means that the committed robots.txt makes it into the live site.
@@ -0,0 +1,8 @@
+User-agent: *
+
+Disallow: /v0.4.3-docs/
This is going to get stale. We need a longer-term solution to avoid updating this on every release. Perhaps we can simply allow /docs.
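One possible shape for that, as a hypothetical sketch (Googlebot honors * wildcards in robots.txt paths, though not every crawler does):

User-agent: *
Allow: /docs/
Disallow: /v*-docs/

That would keep the current /docs/ tree crawlable while blocking any versioned copy, without needing per-release edits.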
1. Issue, if available:
Lack of a /robots.txt file reduces SEO and increases 404 errors.
2. Description of changes:
Adds a robots.txt based on kubernetes/website.
3. Does this change impact docs?
By submitting this pull request, I confirm that my contribution is made under the terms of the Apache 2.0 license.