Assess need to update with changes as standardization continues #4

@benjaminestes

Description

https://developers.google.com/search/reference/robots_txt#what-changed

  • Removed the "Requirements Language" section in this document because the language is specific to Internet drafts.
  • Robots.txt now accepts all URI-based protocols.
  • Google follows at least five redirect hops when fetching robots.txt; if no robots.txt is found after those hops, the result is treated as a 404 for the robots.txt (see the fetcher sketch after this list). Logical redirects for the robots.txt file based on HTML content returned with 2xx (frames, JavaScript, or meta refresh-type redirects) are discouraged; in that case the content of the first page is used for finding applicable rules.
  • For 5xx responses: if the robots.txt is unreachable for more than 30 days, the last cached copy is used; if no cached copy is available, Google assumes there are no crawl restrictions.
  • Google treats unsuccessful requests or incomplete data as a server error.
    "Records" are now called "lines" or "rules", as appropriate.
  • Google doesn't support the handling of elements with simple errors or typos (for example, "useragent" instead of "user-agent"); see the strict-parsing sketch below.
  • Google currently enforces a size limit of 500 kibibytes (KiB) and ignores content after that limit (illustrated in the read-cap sketch below).
  • Updated the formal syntax to be valid Augmented Backus-Naur Form (ABNF) per RFC 5234 and to cover UTF-8 characters in robots.txt.
  • Updated the definition of "groups" to make it shorter and more to the point. Added an example for an empty group.
  • Removed references to the deprecated Ajax Crawling Scheme.
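
A couple of these items describe fetch-time behavior a crawler implements rather than parsing. Here is a minimal Go sketch of the redirect rule, using only net/http; the `fetchRobots` helper is hypothetical and not part of this package.

```go
package main

import (
	"errors"
	"fmt"
	"net/http"
)

// fetchRobots follows at most five redirect hops when fetching
// robots.txt, mirroring the limit described above.
func fetchRobots(host string) (*http.Response, error) {
	client := &http.Client{
		// Abort once five hops have been followed.
		CheckRedirect: func(req *http.Request, via []*http.Request) error {
			if len(via) >= 5 {
				return errors.New("robots.txt: stopped after 5 redirect hops")
			}
			return nil
		},
	}
	return client.Get("https://" + host + "/robots.txt")
}

func main() {
	resp, err := fetchRobots("example.com")
	if err != nil {
		fmt.Println("fetch error:", err)
		return
	}
	defer resp.Body.Close()
	// Per the notes above, a 404 at this point means no crawl
	// restrictions apply to the host.
	fmt.Println("status:", resp.StatusCode)
}
```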
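The strict-parsing sketch below shows what "no handling of typos" means in practice, assuming a line-oriented parser; the `parseLine` function and its field list are illustrative, not this library's API.

```go
package main

import (
	"fmt"
	"strings"
)

// parseLine recognizes only exact (case-insensitive) field names, so a
// typo such as "useragent" is ignored rather than silently corrected.
func parseLine(line string) (field, value string, ok bool) {
	parts := strings.SplitN(line, ":", 2)
	if len(parts) != 2 {
		return "", "", false
	}
	switch f := strings.ToLower(strings.TrimSpace(parts[0])); f {
	case "user-agent", "allow", "disallow", "sitemap":
		return f, strings.TrimSpace(parts[1]), true
	default:
		return "", "", false // "useragent" and other typos land here
	}
}

func main() {
	_, _, ok := parseLine("useragent: Googlebot")
	fmt.Println(ok) // false: the misspelled field is not recognized
}
```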
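And a read-cap sketch for the 500 KiB limit, built on the standard library's io.LimitReader; `readRobots` is a hypothetical helper.

```go
package main

import (
	"fmt"
	"io"
	"net/http"
)

// maxRobotsBytes is the documented 500 KiB cap; anything past it is
// silently ignored.
const maxRobotsBytes = 500 * 1024

// readRobots reads at most maxRobotsBytes of the response body and
// discards the remainder.
func readRobots(resp *http.Response) ([]byte, error) {
	defer resp.Body.Close()
	return io.ReadAll(io.LimitReader(resp.Body, maxRobotsBytes))
}

func main() {
	resp, err := http.Get("https://example.com/robots.txt")
	if err != nil {
		fmt.Println("fetch error:", err)
		return
	}
	body, err := readRobots(resp)
	if err != nil {
		fmt.Println("read error:", err)
		return
	}
	fmt.Printf("kept %d bytes\n", len(body))
}
```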

Metadata

Labels

question: Further information is requested
