Some webmasters have recently reported what the Sitemaps report in Google Search Console is showing them: more than 11,000 URLs flagged as blocked. These URLs turned out to be blocked by robots.txt, and the flag was essentially a warning from Google. However, the webmasters wanted to know why the Index Coverage report in the new Google Search Console had not yet surfaced these issues.
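For context, a "blocked by robots.txt" warning of this kind typically traces back to a Disallow rule in the site's robots.txt file. The snippet below is a hypothetical sketch only; the actual rules on the affected sites are not known from the report:

```
# Hypothetical robots.txt that would cause Search Console
# to flag matching sitemap URLs as "blocked by robots.txt"
User-agent: *
Disallow: /private/
Disallow: /tmp/
```

If a submitted sitemap lists URLs under paths like these, Search Console can report them as blocked even though they were intentionally submitted.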
John Mueller of Google has responded. He tweeted that the new report does not flag errors found on sample URLs at the sitemap-submission stage. Mueller pointed out that these are sample URLs checked when a sitemap is submitted, before the pages are actually indexed. Because that check happens at sitemap submission, the results would not appear in the new Search Console's indexing report.