Types of Crawl Errors and How to Fix Crawl Errors in Google Search Console:
Google Search Console is one of the best tools for finding and fixing crawl errors and keeping your website healthy and useful to users. In its Crawl Errors section, Google lists the errors found on your website along with the likely cause of each one.
Crawling matters because Google indexes a page based on what it finds while crawling it, so it is important to make sure Googlebot can crawl every page of your website and that crawl errors in Search Console are kept to a minimum. The Crawl Errors report is one of the most popular features in the tool, covering errors from the last 90 days. In Google Search Console, the Crawl Errors section in the left-hand menu lists the errors Googlebot encountered so that you can review and fix them.
Types of Crawl Errors in Google Search Console:
Crawl errors in Search Console are split into two categories:
1. Site Errors
2. URL Errors
Site errors are not specific to a particular URL; they affect your entire website. Errors in this category include:
1. DNS resolution failures
2. Server connectivity issues
3. Timeouts or other web server problems
4. Problems fetching the robots.txt file
Site errors are reported when Googlebot fails before it can even request a specific URL.
How to fix Site Errors in Search Console:
When you receive a notification or see a site error in Search Console, it is broken down by error type: Server connectivity, Robots.txt, and DNS.
If you are facing server-level errors, confirm that your website can be crawled successfully by running a Fetch as Google on the URL.
If it is a DNS error, make sure your DNS records are healthy. If the report says your robots.txt file can't be fetched or is blocked, run the robots.txt test in Search Console to confirm you are not blocking Googlebot.
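The first step of any crawl, DNS resolution, can be sanity-checked locally. Below is a minimal sketch using Python's standard library; the helper name and the example hostnames are illustrative assumptions, not part of Search Console:

```python
import socket

def check_dns(hostname):
    """Resolve a hostname the way a crawler must before fetching a page.

    Returns the resolved IP address, or None if resolution fails --
    the situation Search Console reports as a DNS error.
    """
    try:
        return socket.gethostbyname(hostname)
    except socket.gaierror:
        return None

print(check_dns("localhost"))             # an IP string, e.g. "127.0.0.1"
print(check_dns("no-such-host.invalid"))  # None (.invalid never resolves)
```

If this returns None for your own domain, the problem is at the DNS level, before your web server or robots.txt ever comes into play.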
URL errors are specific to a particular page. When Googlebot tried to crawl the URL, it resolved DNS, connected to the server, and fetched your robots.txt with no issues, but something went wrong when it fetched that individual URL, for example a broken link. These errors are categorized by what caused them, and are reported separately for WEB and MOBILE, which helps you dig deeper and fix each crawl error. The URL errors report shows up to 1,000 URLs per category.
Types of URL Errors in Search Console:
Below are a few of the common URL error types and how to fix each crawl error in Search Console:
Crawl Error: Server Error
This means there is a server connectivity problem: Googlebot could not connect to the server or access your URL, the request timed out, or your site was too busy serving other requests and dropped the connection.
How to fix Server Errors in Search Console:
To fix a server error in Search Console, make sure your website can handle server requests without strain and has no server connectivity problems; the server should respond to Googlebot's requests as expected.
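As a rough illustration of how status codes map onto the error buckets this article covers, here is a simplified classifier. The function and its buckets are assumptions for teaching purposes, not Google's actual internal logic:

```python
def classify_status(status_code):
    """Map an HTTP status code to the crawl-error bucket it roughly
    corresponds to in Search Console (simplified for illustration)."""
    if 500 <= status_code <= 599:
        return "server error"    # 5xx: the server failed to respond properly
    if status_code == 404:
        return "not found"       # the URL does not exist
    if status_code in (401, 403):
        return "access denied"   # authentication or authorization required
    return "ok" if 200 <= status_code <= 299 else "other"

print(classify_status(503))  # "server error"
print(classify_status(404))  # "not found"
```

The key point for this section: any 5xx response your server sends to Googlebot lands in the server error report.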
Crawl Error: 404 Not Found
A 404 error means the URL is broken: Googlebot requested a URL that does not exist on your website. In most cases you can safely ignore 404 errors, since those URLs are not part of Google's index, and even many 404s will not harm your ranking in Google search results. The report is still useful, though, because it tells you how your site's URLs are being crawled and helps you keep the site healthy. If the page has simply moved, fix the error by implementing a 301 redirect to the new URL.
How to Fix crawl Error: 404 Errors in Search Console
You can simply mark these URLs as fixed in Search Console; they do not harm your website.
If a URL listed in the 404 errors section should still work, make it work by implementing a 301 redirect to an existing URL on your website. If the page was deliberately deleted, returning a 404 is the right thing to do. You can dig deeper via the Linked from section on each listed URL's page in Search Console to see where the broken links come from.
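The 301 fix above can be sketched as a tiny redirect table on the server side. The paths and helper below are hypothetical, made up for illustration; in practice you would configure this in your web server or CMS:

```python
# Hypothetical mapping of moved URLs to their replacements.
REDIRECTS = {
    "/old-page": "/new-page",
    "/2016/crawl-errors": "/guides/crawl-errors",
}

def handle_request(path):
    """Return the (status, location) pair a server could send:
    301 with a Location target for moved pages, 404 otherwise."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]
    return 404, None

print(handle_request("/old-page"))  # (301, "/new-page")
print(handle_request("/gone"))      # (404, None)
```

Moved pages get a 301 so link equity and visitors follow to the new URL; genuinely deleted pages correctly stay 404.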
Crawl Error URLs Blocked By Robots.txt:
If Search Console says URLs are blocked by robots.txt, it means Googlebot is unable to crawl those pages because your robots.txt file instructs it not to; Googlebot will not crawl any URL it has been told to stay away from.
How to fix Crawl Error urls Blocked by Robots.txt:
To solve this error, make sure the URLs you want indexed, along with your CSS and JS files, are allowed for Googlebot rather than blocked. If you want to test whether a listed URL is blocked or allowed, use the robots.txt Tester available in Search Console.
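You can also check a rule locally with Python's standard-library robots.txt parser. The sample rules below are made up for illustration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules, supplied as a list of lines.
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

rp = RobotFileParser()
rp.parse(rules)

# Anything under /private/ is blocked for all crawlers, including Googlebot.
print(rp.can_fetch("Googlebot", "https://example.com/private/page"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/blog/post"))     # True
```

This mirrors what the robots.txt Tester in Search Console does: it evaluates the URL against your rules from Googlebot's point of view.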
Crawl Error: Access Denied
Access denied is an error which says Googlebot is forbidden from crawling or accessing the page. Googlebot discovers pages and crawls them URL by URL, and in order to crawl a URL it needs to be able to access the page.
Normally this error appears because the URL requires users to log in to view some or all of the content, or because your server requires authentication.
To fix an access denied crawl error, test the URL against your robots.txt file and make sure you are not unintentionally blocking sign-up pages or other important content of your website; pages that genuinely require a login can be left as they are.
Crawl Error Soft 404:
A soft 404 occurs when the server returns a real page for a URL that doesn't actually exist on your site: it sends a 200 OK response code for a page with no content, or for a page that does not exist.
How to Fix Soft 404 Errors in Search console:
Make sure the server returns a 404 status code instead of a 200 OK response. The best way to check the status is to run a Fetch and Render in Google Search Console. If the response really is a valid 200 OK, everything is fine; note that pages with very thin content can sometimes also be reported as soft 404s.
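A rough local check for this pattern can be sketched as follows. The helper name and the 100-character threshold are arbitrary illustrative assumptions, not Google's actual heuristic:

```python
def looks_like_soft_404(status_code, body):
    """Flag responses that claim success (200 OK) but carry almost no
    content -- the pattern Search Console reports as a soft 404.
    The 100-character threshold is an arbitrary illustrative choice."""
    return status_code == 200 and len(body.strip()) < 100

print(looks_like_soft_404(200, ""))                        # True
print(looks_like_soft_404(404, ""))                        # False: a real 404
print(looks_like_soft_404(200, "A full article... " * 20)) # False: has content
```

The fix is always on the server side: an empty or missing page should answer with a 404 status, not a 200 with no content.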