I can think of a use for robots.txt: to exclude all files on pebblecode-staging.herokuapp.com and pebblecode-sandbox.herokuapp.com, so search engines don't index duplicate copies of their content.
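A minimal sketch of what such a file could look like on the staging and sandbox apps (assuming each app serves its own robots.txt at the site root; this is illustrative, not the file that was actually deployed):

```
# Hypothetical robots.txt for pebblecode-staging.herokuapp.com and
# pebblecode-sandbox.herokuapp.com: block all crawlers from every path
# so the staging/sandbox content is never indexed alongside pebblecode.com.
User-agent: *
Disallow: /
```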
Robots.txt is also used to avoid what are known as "canonicalization" problems, i.e. having multiple "canonical" URLs for the same content. This problem is sometimes incorrectly referred to as a "duplicate content" problem.
I read that you can just leave it out if you want to allow everything, and submitting the sitemap to Google Webmasters seemed to be OK, but there's no harm in adding it. I've added it at http://pebblecode.com/robots.txt
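For reference, a permissive robots.txt for the production site only needs a couple of lines. This is a sketch of that pattern, not the actual contents of the file linked above, and the Sitemap URL is an assumption:

```
# Hypothetical permissive robots.txt for pebblecode.com: allow all crawlers
# everywhere (an empty Disallow blocks nothing) and advertise the sitemap.
# The sitemap URL below is assumed, not taken from the live file.
User-agent: *
Disallow:

Sitemap: http://pebblecode.com/sitemap.xml
```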