The “new” Google Search Console rolled out to webmasters in 2018. You may have received a message titled “Introducing the new Search Console (beta)” when this occurred, or read about it in Google’s official blog announcement.

Since then, a scary warning has gone out to thousands of webmasters. That warning, or error, reads “New Index coverage issue detected for site www.YourDomain.com” and reports that a new issue was found, such as “Submitted URL blocked by robots.txt”.

It looks like this:

[Screenshot: the “New Index coverage issue detected” email from Google Search Console]

What does “New Index coverage issue detected for site” mean?

This could mean any number of things, but in most cases your sitemap lists a file or folder, such as “/images/”, that is blocked by your robots.txt file. The robots.txt file tells search engines not to crawl certain parts of a website. In many cases a robots.txt file will intentionally block things such as an admin login screen; wannabe hackers are constantly scanning the web for these pages, and keeping them out of search results prevents unnecessary hack attempts.
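For example, a robots.txt file along these lines blocks crawlers from an images folder and an admin page (the paths here are hypothetical placeholders, not your actual file):

    User-agent: *
    Disallow: /images/
    Disallow: /wp-admin/

If your sitemap then lists a URL under /images/, Google sees the conflict between the two files and reports it as “Submitted URL blocked by robots.txt”.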

It is, however, always wise to click “Fix index coverage issues” and see exactly which resources have been blocked. Every time I have seen this message, the blocked resource was a single irrelevant file or folder, and I safely ignored the warning. One error like this is not going to cause your rankings to tank in search or harm your web presence in any way, shape or form.
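If you would rather double-check a flagged URL yourself than take the report at face value, Python’s built-in urllib.robotparser module can tell you whether a given crawler is allowed to fetch it. This is just a quick sketch; the domain and path below are placeholders for your own:

    from urllib.robotparser import RobotFileParser

    # Download and parse the site's robots.txt (placeholder domain)
    rp = RobotFileParser()
    rp.set_url("https://www.example.com/robots.txt")
    rp.read()

    # Ask whether Googlebot may crawl the URL Search Console flagged
    print(rp.can_fetch("Googlebot", "https://www.example.com/images/logo.png"))

If this prints False, the URL really is blocked, and you can decide whether the block is intentional before dismissing the warning.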

Len

12 Responses

1. In your case, you have some URLs blocked by robots.txt. There are just a couple of them, and it looks intentional. This is likely what Google is referring to.

