URLs showing as blocked by robots.txt


#1

While checking the Index Status report in Search Console, I noticed the number of indexed pages has dropped to 2, and it's showing URLs blocked by robots. How can I fix this issue?

Also, “Fetch as Google” is only showing 17 records, even though I have already submitted more than 200 URLs.

Please help.


#2

Check your blog's robots.txt file for Disallow rules that might be blocking those links.
Ex: yoursitename.com/robots.txt
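
If you want to check programmatically which URLs are affected, here's a quick sketch using Python's built-in urllib.robotparser (yoursitename.com and the sample paths are placeholders, swap in your own):

```python
from urllib.robotparser import RobotFileParser

# Placeholder domain; replace with your own site.
ROBOTS_URL = "https://yoursitename.com/robots.txt"

parser = RobotFileParser()
parser.set_url(ROBOTS_URL)
parser.read()  # fetch and parse the live robots.txt

# Replace these with the URLs Search Console reports as blocked.
urls = [
    "https://yoursitename.com/",
    "https://yoursitename.com/some-post/",
]

for url in urls:
    # can_fetch() is False when a Disallow rule blocks the given user agent.
    status = "allowed" if parser.can_fetch("Googlebot", url) else "BLOCKED"
    print(status, url)
```

Any URL printed as BLOCKED has a matching Disallow rule you'll want to remove or narrow.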


#3

It isn't necessarily a problem for some URLs to be blocked, but do check which ones they are. :slight_smile:

Could you please share a screenshot of the “Fetch as Google” records?


#4

All the URLs (old and new) are showing as Redirected, even though I have added a separate property and sitemap for the https version of the site. Please see the screenshot.


#5

First, delete the http version of the property from your Google Webmaster home.

Next, create a new property with the https version of your site's URL.

Now, submit your sitemap. That should clear everything up.
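
Before submitting, it can also help to confirm that every sitemap entry already uses https, since any http entries will keep showing as Redirected. A rough sketch (assumes Python with the requests package installed, and that your sitemap lives at /sitemap.xml):

```python
import xml.etree.ElementTree as ET

import requests

# Placeholder location; adjust to wherever your sitemap lives.
SITEMAP_URL = "https://yoursitename.com/sitemap.xml"
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)

for loc in root.findall(".//sm:loc", NS):
    url = loc.text.strip()
    if not url.startswith("https://"):
        # Entries like this force Google through a redirect on every crawl.
        print("still http:", url)
```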


#6

Thanks Vignesh.

Now when I submit URLs for the https property, they are getting submitted for indexing, but do I need to add a property for the http version again?

Also, if I have submitted a URL for desktop, do I need to submit the same URL for mobile too?


#7

No need to add the http version or the mobile version, because the http version redirects to https.
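
If you want to double-check that the redirect is actually in place, here's a short sketch using the requests package (the URL is a placeholder):

```python
import requests

# Any http page on your site will do.
resp = requests.head("http://yoursitename.com/", allow_redirects=False, timeout=10)

# Expect a 301 with a Location header pointing at the https URL;
# that is what lets Google consolidate everything onto the https property.
print(resp.status_code, resp.headers.get("Location"))
```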