Digital Marketing
Google Updates Robots.txt Policy: Unsupported Fields Now Clearly Ignored
Google has updated its robots.txt documentation to clarify that its crawlers ignore any fields it does not support. Website owners should rely only on documented fields such as user-agent, allow, disallow, and sitemap, and should review their existing robots.txt files against Google’s official guidelines to ensure compliance.
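For reference, here is a minimal robots.txt sketch limited to the fields Google documents; the paths and sitemap URL are hypothetical placeholders, not taken from Google's announcement.

```
# Hypothetical robots.txt using only Google-supported fields
User-agent: *
Disallow: /private/
Allow: /private/public-page.html

Sitemap: https://www.example.com/sitemap.xml

# Unsupported fields (for example, crawl-delay) would simply be ignored by Google's crawlers.
```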