Why a Submitted URL Has a Crawl Issue:
Search Console reports this error when a URL you submitted cannot be crawled successfully. Whenever you submit a URL, request indexing, or update the sitemap file on your website, Googlebot crawls the submitted URL so it can be indexed in Google. If Googlebot runs into any difficulty while crawling, such as a server error (a 5xx-level response), an unexpected 404 page, a block in robots.txt, a noindex meta tag (which prevents indexing), or anything else that stops it from fetching the URL, Search Console fires the "Submitted URL has crawl issue" error, sometimes listing a large number of URLs.
What Is Submitted URL Has Crawl Issue in Search Console:
"Submitted URL has crawl issue" means that Googlebot was unable to fetch the URL for some reason. The exact reason varies depending on the URL that the webmaster or site owner submitted, so each affected URL has to be checked individually.
How to Fix Submitted URL Has Crawl Issue:
If you check the error list under Submitted URL has crawl issue, the first thing to do is inspect each URL that Search Console reports: run a URL Inspection, review the errors it lists, and fix the underlying issue. You can also run a quick local check first, as in the sketch below.
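Before opening URL Inspection, a quick local fetch often reveals the problem. Below is a minimal sketch, assuming the third-party requests library is installed; the URL is a hypothetical placeholder, and the body scan for noindex is a crude heuristic, not a full HTML parse.

```python
import requests

def precheck(url):
    # Fetch without following redirects so we see this URL's own status code.
    resp = requests.get(url, timeout=10, allow_redirects=False)
    print(url, "->", "HTTP", resp.status_code)
    # A noindex directive can come from an HTTP header as well as a meta tag.
    if "noindex" in resp.headers.get("X-Robots-Tag", "").lower():
        print("  blocked by an X-Robots-Tag: noindex header")
    if "noindex" in resp.text.lower():
        print("  page body may contain a noindex meta tag (verify manually)")

precheck("https://example.com/some-submitted-url")  # hypothetical URL
```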
Submitted URL Has Crawl Issue with a 404 Page (URL):
404 pages also throw the Submitted URL has crawl issue error in Google Search Console and are listed in the Index Coverage section. The reason for this crawl error is that Googlebot tried to crawl a URL from your sitemaps (or from another source available on the web) and received a 404 status instead of a 200 OK response code, so the URL is listed in the Submitted URL has crawl issue section of Search Console.
How to Fix Submitted URL Has Crawl Issue for a 404 Page:
In this case you submitted a page to Google that is not a valid URL (one returning a 200 OK response); it is a 404 page (HTTP status code 404). A URL that returns 404 should not be listed in your sitemap files or in your internal linking structure. Due to technical errors it is common for a few URLs that once worked to start returning 404, so Google reports them in Search Console: a URL that was not a 404 when Googlebot first fetched it is now reporting as 404.
You should update your sitemaps with the latest list of working URLs and remove the 404 URLs. You can submit sitemaps to Google in a number of ways, for example through Search Console, or by pinging Google directly via a URL, as in the sketch below.
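A minimal sketch of both steps, assuming the requests library is installed, a hypothetical sitemap location, and that Google's classic sitemap ping endpoint is still available: it flags sitemap entries that no longer return 200 OK, then pings Google once the file has been cleaned up.

```python
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://example.com/sitemap.xml"  # hypothetical location
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# 1. Flag sitemap entries that do not return 200 OK.
root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
for loc in root.findall(".//sm:loc", NS):
    url = loc.text.strip()
    status = requests.head(url, timeout=10, allow_redirects=False).status_code
    if status != 200:
        print(f"Remove or redirect: {url} (HTTP {status})")

# 2. After removing the bad entries, ping Google to re-fetch the sitemap.
ping = requests.get("https://www.google.com/ping",
                    params={"sitemap": SITEMAP_URL}, timeout=10)
print("Sitemap ping status:", ping.status_code)
```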
Since the URL returned a 404 page instead of a 200 OK response, it is valid for Search Console to throw the Submitted URL has crawl issue error. To get rid of the error for a 404, add a 301 redirect from the 404 URL to a related URL if one exists; otherwise you can leave it as it is, and Googlebot will remove the URL from the Google index the next time it crawls it.
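How you configure the 301 depends on your stack. A minimal sketch in a Flask app, with both paths as hypothetical placeholders:

```python
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/old-deleted-page")  # the URL that now returns 404
def old_page():
    # 301 tells Googlebot the move is permanent, so it updates its index.
    return redirect("/related-live-page", code=301)
```

On Apache or Nginx the same effect comes from a Redirect 301 or return 301 directive in the server configuration.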
It is also fine with Google if the page stays a 404, as 404 pages by themselves do not affect your website's traffic or SERP listings. Googlebot is smart enough to handle them; Search Console is simply letting you know what is happening with your website and which crawled pages or URLs returned errors. The old Google Webmaster Tools did not report these issues the way Search Console does now.
The new Search Console is an updated version of the old tool: it goes deeper into technical SEO, while the old Google Webmaster Tools mainly covered performance and a smaller set of errors.
Reasons Why a Submitted URL Has a Crawl Issue:
1. The URL points to a non-existent page
2. The URL is not accessible (403 Forbidden)
3. The URL is blocked by robots.txt
4. The URL returns a 5xx-level server error
5. High server response time
6. The submitted URL takes longer than expected to return a 200 response
7. Server issues
8. DNS errors
9. The URL returns a 410 (Gone) page
10. The URL returns a 404 page
11. Resources on the page take too long to load
12. Thin content
13. Too many ads, making the page not worth submitting for indexing
Several of these reasons can be checked from your own machine, as shown in the sketch after this list.
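Below is a minimal diagnostic sketch covering the checks that can be done locally: HTTP status (404/410/403/5xx), response time, and robots.txt blocking. It assumes the requests library; the URL is a hypothetical placeholder, and a local fetch only approximates what Googlebot sees.

```python
import requests
from urllib.parse import urlparse
from urllib.robotparser import RobotFileParser

def diagnose(url, timeout=10):
    parsed = urlparse(url)

    # Check whether robots.txt blocks Googlebot from this URL.
    rp = RobotFileParser(f"{parsed.scheme}://{parsed.netloc}/robots.txt")
    rp.read()
    if not rp.can_fetch("Googlebot", url):
        print(f"{url}: blocked by robots.txt")

    # Check the status code and how long the server takes to answer.
    try:
        resp = requests.get(url, timeout=timeout)
    except requests.exceptions.Timeout:
        print(f"{url}: timed out after {timeout}s (server too slow)")
        return
    print(f"{url}: HTTP {resp.status_code} in {resp.elapsed.total_seconds():.2f}s")
    if resp.status_code in (404, 410):
        print("  -> page gone: 301-redirect it or remove it from the sitemap")
    elif resp.status_code == 403:
        print("  -> not accessible (403): check server or firewall rules")
    elif resp.status_code >= 500:
        print("  -> server error (5xx): check the server logs")

diagnose("https://example.com/some-submitted-url")  # hypothetical URL
```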
Related Search Console Articles:
1. Fix Server Error 5xx Search Console
2. Fix Xml Sitemap 404 Error in Google Search console (Webmaster Tools)
3. How to Fix Google Critical Mobile Usability Errors in Search Console
4. How to Fix Crawl Errors in Google Webmaster Tools
5. How to Fix URLs Blocked by Robots.txt File in Google Search Console
Run a fetch for a URL listed under Submitted URL has crawl issue, then check the header response: if it is a 4xx-level or 5xx-level response, fix the underlying issue.
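A header-only fetch is enough for this step. A minimal sketch, again assuming requests and a hypothetical URL; note that some servers answer HEAD differently from GET, so fall back to a GET request if the result looks odd.

```python
import requests

resp = requests.head("https://example.com/some-submitted-url",
                     timeout=10, allow_redirects=False)
print("Status:", resp.status_code)  # a 4xx or 5xx here is the issue to fix
print("X-Robots-Tag:", resp.headers.get("X-Robots-Tag", "not set"))
```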
Make sure the URL is not throwing a 404 error and is not inaccessible to or blocked from Googlebot; if it is a 404, the URL will not be indexed in Google, and thin content will not be indexed either.
If the URL throws a 404, implement a 301 redirect to an existing page.
If the submitted URL returns a 5xx-level error, make sure it returns a 200 OK response; and if the submitted URL is blocked by robots.txt or carries a noindex tag, make sure it is crawlable and indexable by Googlebot.
How to Access the List of Submitted URL Has Crawl Issue URLs in Search Console:
To access the URLs under Submitted URL has crawl issue, go to the Index Coverage section in Search Console, check all the boxes (Valid, Error, and Excluded pages), then scroll down: you will see a section named Submitted URL has crawl issue along with a count of affected URLs.
Click on Submitted URL has crawl issue to get the list of affected URLs, then run a live URL Inspection on each one to see what the problem is. Fix the error and make the page available to Googlebot: if it is blocked by robots.txt or returns a Forbidden error, unblock it; if it is a 404 error, add a 301 redirect to an existing or related page; and make sure the submitted URL returns a proper 200 OK response header without a noindex meta tag on the page.