URLs showing as blocked by robots


While checking the Index Status report in Search Console, the number of indexed pages has dropped to 2, and it's showing URLs blocked by robots. How can I fix this issue?

Also, "Fetch as Google" is only showing 17 records, even though I have already submitted more than 200 URLs.

Please help.


Check your blog's robots.txt file for Disallow rules that may be blocking those links.
Ex: yoursitename.com/robots.txt
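If it helps, you can test whether a specific URL is blocked by your robots.txt rules using Python's built-in `urllib.robotparser`. This is just a sketch: the robots.txt content and URLs below are made-up examples, so substitute your own site's file and paths.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content -- paste your site's actual file here
robots_txt = """User-agent: *
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Check whether Googlebot is allowed to crawl particular URLs
print(parser.can_fetch("Googlebot", "https://example.com/post/hello"))    # True
print(parser.can_fetch("Googlebot", "https://example.com/private/page"))  # False
```

Any URL that comes back `False` here will show up as blocked in Search Console, so this is a quick way to find which Disallow line is responsible.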


Some URLs being blocked by Google isn't necessarily a problem, but do check which ones they are. :slight_smile:

Could you kindly share a screenshot of the "Fetch as Google" records, please?


All the URLs (old and new) are showing as Redirected, even though I have added a separate property and sitemap for the https version of the site. Please see the screenshot.


First, delete the http version of the property from your Google Webmaster home.

Next, create a new property with your https version of the URL.

Now, submit your sitemap. Everything should clear up.


Thanks Vignesh.

Now, when I submit URLs for the https property, they are getting submitted for indexing. But do I need to add a property for the http version again?

Also, if I have submitted a URL for Desktop, do I need to submit the same URL for Mobile too?


No need to add the http version or the mobile version, because the http version redirects to https. So, no need.
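For reference, the http-to-https redirect itself is handled by the web server, not by Search Console. On an Apache server it typically looks something like the following, but this is only a sketch; your hosting setup may differ, and many hosts provide a "Force HTTPS" toggle instead:

```apache
# .htaccess -- redirect all http requests to https
# (assumes Apache with mod_rewrite enabled)
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]
```

Using a 301 (permanent) redirect tells Google to consolidate indexing signals onto the https URLs, which is why you only need the https property and sitemap.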


Update your sitemap and check for robots.txt errors in the new beta Search Console. What is it showing there?