Google Updates Robots.txt Policy: Unsupported Fields Are Ignored
Google limits robots.txt support to four fields, clarifying its stance on unsupported directives.
- Google only supports four specific robots.txt fields.
- Unsupported directives in robots.txt will be ignored.
- Consider auditing your robots.txt files in light of this update.

In a recent update to its Search Central documentation, Google has clarified its position on unsupported fields in robots.txt files.
Key Update
Google has stated that its crawlers don’t support fields not listed in its robots.txt documentation.
This clarification is part of Google’s efforts to provide unambiguous guidance to website owners and developers.
Google states:
“We sometimes get questions about fields that aren’t explicitly listed as supported, and we want to make it clear that they aren’t.”
This update should eliminate confusion and prevent websites from relying on unsupported directives.
What This Means:
- Stick to Supported Fields: Use only the fields explicitly mentioned in Google's documentation.
- Review Existing Robots.txt Files: Audit current robots.txt files to ensure they don't contain unsupported directives (a minimal audit sketch appears later in this post).
- Understand Limitations: Google's crawlers may not recognize certain third-party or custom directives.

Supported Fields:
According to the updated documentation, Google officially supports the following fields in robots.txt files:
- user-agent
- allow
- disallow
- sitemap
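For reference, a minimal robots.txt limited to those fields might look like the sketch below. The paths and sitemap URL are placeholders for illustration, not recommendations.

```
# Only fields from Google's documented set: user-agent, allow, disallow, sitemap
User-agent: *
Disallow: /private/
Allow: /private/public-page.html

# Sitemap takes an absolute URL and applies independently of any user-agent group
Sitemap: https://www.example.com/sitemap.xml
```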
Notable Omissions:

While not explicitly stated, this clarification implies that Google doesn't support commonly used directives like "crawl-delay," although other search engines may recognize them.
Additionally, Google is phasing out support for the 'noarchive' directive.
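To act on the audit advice above, a rough sketch in Python is shown below. It flags any field in a local copy of robots.txt that falls outside Google's documented set; the file path, the simple line parsing, and the reporting format are assumptions made for this example, and fields flagged here may still be honored by other crawlers.

```python
# audit_robots.py - hypothetical helper that flags robots.txt fields
# outside the four Google documents (user-agent, allow, disallow, sitemap).

GOOGLE_SUPPORTED = {"user-agent", "allow", "disallow", "sitemap"}

def audit_robots_txt(path="robots.txt"):
    """Return (line_number, field) pairs for fields Google does not support."""
    unsupported = []
    with open(path, encoding="utf-8") as f:
        for line_no, raw in enumerate(f, start=1):
            line = raw.split("#", 1)[0].strip()  # strip comments and whitespace
            if not line or ":" not in line:
                continue  # skip blank lines and lines without a field:value pair
            field = line.split(":", 1)[0].strip().lower()
            if field not in GOOGLE_SUPPORTED:
                unsupported.append((line_no, field))
    return unsupported

if __name__ == "__main__":
    for line_no, field in audit_robots_txt():
        print(f"Line {line_no}: '{field}' is not in Google's supported field list")
```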
Looking Ahead:
This update is a reminder to stay current with official guidelines and best practices.
It highlights the need to use documented features rather than assuming support for undocumented directives.
Consult Google’s official Search Central documentation for more detailed information on robots.txt implementation and best practices.
Featured Image: Thomas Reichhart/Shutterstock