How to Fix URLs Blocked by Robots.txt File in Google Search Console

If you log in to Google Search Console and see the message "URLs blocked by robots.txt", this guide explains why Google fires that error and how to fix it easily.

How to Check URLs Blocked by Robots.txt:

When you log in to Google Search Console, you will find a Crawl section, and under the Crawl tab there is a tool named robots.txt Tester for testing and validating your robots.txt file. It is a great tool for identifying URLs blocked by robots.txt, and it tells you which part of the file is blocking those URLs.

In the robots.txt Tester you can quickly see whether a particular URL is blocked or not, and you can test it against the different user agents listed below (a sample robots.txt with per-agent rules follows the list):

1. For general web search: Googlebot

2. For Google Images: Googlebot-Image

3. For Google News: Googlebot-News

4. For Google Video: Googlebot-Video

5. For mobile: Googlebot-Mobile

6. For AdSense (Google media partners): Mediapartners-Google

7. For ads landing pages: AdsBot-Google
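For reference, here is a small, hypothetical robots.txt showing how directives can target the user agents above; the paths are purely illustrative:

```
# Applies to all crawlers
User-agent: *
Disallow: /private/

# Applies only to Google Images' crawler
User-agent: Googlebot-Image
Disallow: /photos/
```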

Related Coverage:

1. Fix Yoast Sitemap Not Showing And Generating

2. General Http Error in Sitemap Couldn’t Fetch in Search Console

3. How to Fix 502 Bad Gateway Error Timeout Issue

4. How to Fix Soft 404 Errors in Search Console

5. Importance of Google Crawling and Indexing in SEO

Now, paste the URL that Google says is blocked by robots.txt, make sure the general web search user agent (Googlebot) is selected, and press Test. If the URL is blocked, the tester shows "Blocked" in red; if it is allowed, it shows "Allowed" in green, which means everything is OK.
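If you want to double-check a URL outside Search Console, Python's standard library includes a robots.txt parser that applies the same kind of rules. A minimal sketch, assuming example.com and the test URL stand in for your own domain and page:

```python
# Minimal robots.txt check using only the Python standard library.
# Replace example.com and the test URL with your own site's values.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://example.com/robots.txt")
parser.read()  # fetch and parse the live robots.txt

# Test the same URL against several user agents, mirroring the
# tester's drop-down list.
for agent in ["Googlebot", "Googlebot-Image", "Googlebot-News"]:
    url = "https://example.com/some-blocked-page/"
    allowed = parser.can_fetch(agent, url)
    print(agent, "allowed" if allowed else "blocked")
```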

If the URL shows as blocked in red, the tester also highlights which directive is causing the block. If you see that red "Blocked" result, here is how to fix it:

How to Fix URLs Blocked by Robots.txt:

Log in to your web hosting manager, go to the file manager, and locate the robots.txt file. Download it, open it in your editor, and either delete the blocking rule (or the whole file, if you do not need one) or edit the directives to suit your site. If you are making changes, make sure Googlebot is allowed to crawl the affected path; a hypothetical before-and-after edit is shown below.
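As an example of such an edit, suppose the tester highlighted a Disallow rule covering your whole blog; narrowing it to the folder you actually want hidden unblocks the rest (paths are illustrative):

```
# Before: blocks every URL under /blog/ for all crawlers
User-agent: *
Disallow: /blog/

# After: only /blog/drafts/ stays blocked; the rest of /blog/ can be crawled
User-agent: *
Disallow: /blog/drafts/
```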

After making the necessary changes, upload the edited robots.txt file and test the URL again with the robots.txt Tester; it should now show "Allowed" in green. The next time Googlebot visits, it will crawl your website normally, and the URLs you fixed will no longer be blocked by Google.
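To confirm that the file you uploaded is the one actually being served (a wrong directory or a caching layer can silently serve the old version), you can fetch it directly. A minimal sketch, again with example.com as a placeholder:

```python
# Fetch and print the live robots.txt so you can verify the edited
# version is what crawlers will actually see.
from urllib.request import urlopen

with urlopen("https://example.com/robots.txt") as response:
    print(response.read().decode("utf-8"))
```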

How Long Does It Take to Update the Robots.txt File:

If you update your robots.txt file, it typically takes 24 to 48 hours to take effect, according to Google's official documentation. The changes apply on your server as soon as you save the file, but depending on your website's performance and server configuration, it may take longer for Google's crawlers to re-fetch your robots.txt and pick up the update.

Alternative Robots.txt Testing Tools and Generators:

Apart from the powerful and useful tool provided in Google Search Console, there are many alternative robots.txt testing tools available online. One of them is the W3C robots.txt validator, which many professional SEOs use to test their robots.txt files.

Author
Ramana Tula is a Google Product Expert and a full-stack web and Android developer.
