Google Webmasters ends robots.txt noindex support
Tech News

Google Webmasters will no longer support noindex in robots.txt
Google Webmasters has announced it is ending support for the robots.txt noindex directive. Google has also open-sourced its robots.txt parser code.

Google has announced the end of support for the robots.txt noindex directive. In an official blog post, the webmaster team said it will stop honoring unsupported indexing directives in robots.txt. The news was announced today, 2nd July 2019, a day after Google open-sourced its robots.txt parser code.

Google Webmasters retires the robots.txt noindex directive

Google made the announcement on the Webmasters' official Twitter handle.

The tweet linked to a blog post explaining the change in detail. Publishers and bloggers have until September 1, 2019 to remove the noindex directive from robots.txt and switch to an alternative way of controlling indexing. The blog post lists five alternatives:

  1. Noindex in robots meta tags.
  2. 404 and 410 HTTP status codes.
  3. Password protection.
  4. Disallow in robots.txt.
  5. The Search Console Remove URL tool.
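As a minimal sketch of the first alternative, a robots meta tag goes in a page's HTML `<head>` (the page itself is hypothetical):

```html
<!-- A robots meta tag in the page's <head> tells
     compliant crawlers not to index this page -->
<meta name="robots" content="noindex">
```

Note that alternative 4, Disallow in robots.txt, blocks crawling rather than indexing, so a disallowed URL can still appear in search results if other pages link to it.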

Google makes the robots.txt parser open source


The announcement came a day after Google released its robots.txt parser code as open source. Developers and other crawler makers can now read and use the code, which makes part of the GoogleBot pipeline public: anyone can see how GoogleBot reads robots.txt files.
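Google's open-sourced parser is written in C++, but the general behavior — matching a user agent against Disallow rules in robots.txt — can be sketched with Python's standard-library `urllib.robotparser` (a stand-in for illustration, not Google's code; the rules below are a made-up example):

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt using the Disallow rule,
# which remains supported after the noindex change
rules = """User-agent: *
Disallow: /private/
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Disallowed path is blocked; everything else is crawlable
print(parser.can_fetch("MyBot", "https://example.com/private/page"))  # False
print(parser.can_fetch("MyBot", "https://example.com/index.html"))    # True
```

This is roughly the check any well-behaved crawler performs before fetching a URL.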