
Robots.txt Noindex Gets Canceled by Google

Industry giant Google has officially announced on its Twitter page that it will be “saying goodbye to undocumented and unsupported rules in robots.txt”. A post on Google’s official blog explains that the decision was made “in the interest of maintaining a healthy ecosystem and preparing for potential future open source releases”; thus, beginning September 1, 2019, “all code that handles unsupported and unpublished rules (such as noindex directive)” will no longer be supported by Google.

Google’s Solutions for Page Indexing

However, for companies that relied on these robots.txt rules to keep their pages out of Google’s index, Google has provided some alternatives on its blog:

  1. Noindex in robots meta tags: According to the blog post, this is the most effective way to remove URLs from the index when crawling is allowed (see the first sketch after this list).
  2. 404 and 410 HTTP status codes: Returning these “not found” and “gone” status codes tells Google that the page no longer exists, and such URLs are dropped from the index once they have been crawled and processed.
  3. Password Protection: Hiding pages behind a login will generally remove them from Google’s index. If, on the other hand, you want password-protected pages to remain indexed, you can use markup to indicate subscription or paywalled content (see the second sketch below).
  4. Search Console Remove URL tool: Quickly and temporarily remove a URL from Google’s search results with this tool.
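
To make the first two alternatives concrete, here is a minimal sketch in Python using Flask (our choice of framework for illustration; the route paths and page text are hypothetical, not something Google’s post prescribes). It applies the noindex rule both as an X-Robots-Tag response header and as a robots meta tag, and returns a 410 Gone status for a retired page:

    # Minimal sketch with Flask (assumption: pip install flask).
    from flask import Flask, make_response

    app = Flask(__name__)

    # Alternative 1, header form: noindex sent as an X-Robots-Tag
    # HTTP response header.
    @app.route("/private-report")
    def private_report():
        resp = make_response("Quarterly report")
        resp.headers["X-Robots-Tag"] = "noindex"
        return resp

    # Alternative 1, meta tag form: the same rule placed in the
    # page's <head> as a robots meta tag.
    @app.route("/private-page")
    def private_page():
        return """<!doctype html>
    <html>
      <head><meta name="robots" content="noindex"></head>
      <body>This page should stay out of Google's index.</body>
    </html>"""

    # Alternative 2: a retired page answers with 410 Gone (404 Not
    # Found works the same way), so Google drops the URL from its
    # index after recrawling it.
    @app.route("/retired-page")
    def retired_page():
        return "This page has been removed.", 410

    if __name__ == "__main__":
        app.run()

Either the header or the meta tag is enough on its own; the header form is especially handy for non-HTML files such as PDFs, where there is no <head> to put a meta tag in.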
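
For the third alternative, the paywalled-content markup Google refers to is schema.org structured data using the isAccessibleForFree property. The sketch below generates that JSON-LD in Python; the NewsArticle type, the headline, and the “.paywall” CSS selector are illustrative assumptions you would replace with your own page’s details:

    # Minimal sketch: build the paywalled-content JSON-LD block.
    import json

    # "isAccessibleForFree": "False" tells Google the content is
    # paywalled rather than cloaked; "cssSelector" points at the
    # (hypothetical) element that holds the restricted content.
    paywall_markup = {
        "@context": "https://schema.org",
        "@type": "NewsArticle",
        "headline": "Example subscriber-only article",
        "isAccessibleForFree": "False",
        "hasPart": {
            "@type": "WebPageElement",
            "isAccessibleForFree": "False",
            "cssSelector": ".paywall",
        },
    }

    # Emit the script block that goes in the page's <head>.
    print('<script type="application/ld+json">')
    print(json.dumps(paywall_markup, indent=2))
    print('</script>')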

If your business was relying on these rules, don’t wait until September to make the required updates. Instead, contact Convertico Media today. Our talented SEO experts will go through your website and make sure your indexing rules are set up correctly. If you are thinking about building a website, or about improving your rankings in search results, we can help you with that too!

In any case, we would love to hear from you! Talk to our team today.