"Discovered – currently not indexed" is a common issue: you publish an article on your website, and when Google processes the page you see it reported as discovered but currently not indexed. You can fix this issue on your own by improving your pages and optimizing your website for crawling and indexing.
In simple words, "Discovered – currently not indexed" means that Googlebot has discovered your page URLs for the first time (for example, via your sitemap or links) but has not yet crawled or indexed them in Google Search; you can read more here from Google. Once Googlebot crawls such a URL, its status changes to "Crawled – currently not indexed", meaning the URL is in the queue to be indexed. If the URL meets the search quality guidelines without any errors, Google will index the page, and the lists of URLs under both "Discovered – currently not indexed" and "Crawled – currently not indexed" will shrink.
If your website recently went live and is new to Google, or you recently submitted your sitemap, or you recently published an article or page, then you can expect to see URLs under Index Coverage status -> Excluded pages -> Discovered – currently not indexed. To get rid of these URLs, you need to wait some time, improve your website, and build good internal links between your pages.
You can ask Google to re-crawl a page by submitting its URL through the URL inspection tool and clicking Request Indexing. The inspection tool also reports any errors; if you find issues, fix them first.
To get a list of all "Discovered – currently not indexed" pages, log in to your Search Console account and select your website property in the left sidebar, then click the Overview section -> click Coverage in the left pane -> in the right pane, click the Excluded tab and uncheck Error and Valid with warnings -> now scroll down and click the "Discovered – currently not indexed" row to see the full list of affected URLs.
Now that you have the list of "Discovered – currently not indexed" pages, you need to fix the issues on each of them. Let's see the fixes in detail below.
Note: the right fix for "Discovered – currently not indexed" is different for every website.
Review all the pages listed in the "Discovered – currently not indexed" section. Most of the time you will find thin content, copied content, content that should be pruned, or pages that are very similar to existing pages on your website and offer no value to users.
Add more content to those pages, make each URL unique, and make sure the post actually answers the question it targets. Once you are done, submit the URLs to Google again via the URL inspection tool and Request Indexing.
If a page meets the requirements, Google will index it in Search, and the URL will disappear from the "Discovered – currently not indexed" list in Search Console.
1 – First, make sure that you're not accidentally generating too many URLs. A flood of URLs overloads your server during crawling, so Google lists them as discovered and leaves them in the queue, waiting to be crawled.
2 – Improve your page content; make it unique and as useful as possible for users.
3 – Avoid creating pages that duplicate content you already have; instead, strengthen the existing page by adding unique information that makes it more useful than competing pages on other websites.
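To illustrate the duplicate-URL problem from step 1, here is a minimal Python sketch. The `canonicalize` function is a hypothetical helper (not part of any Google tool) showing how parameter-order and host-case variants of the same page collapse to one canonical URL, which is how you can spot near-duplicates before they bloat your crawl queue:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def canonicalize(url: str) -> str:
    """Normalize a URL so trivially different variants compare equal:
    lowercase the scheme and host, sort query parameters, drop fragments."""
    parts = urlsplit(url)
    query = urlencode(sorted(parse_qsl(parts.query)))
    return urlunsplit((parts.scheme.lower(), parts.netloc.lower(),
                       parts.path, query, ""))

# Two URL variants that a CMS might generate for the same page:
a = canonicalize("https://Example.com/shoes?color=red&size=9")
b = canonicalize("https://example.com/shoes?size=9&color=red")
print(a == b)  # True: both normalize to the same canonical URL
```

If many distinct URLs on your site canonicalize to the same string, Google is likely seeing them as duplicates too.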
Googlebot discovered your URLs but was not able to crawl them. Possible reasons: server overload, Googlebot rescheduling the crawl, or Googlebot knowing about the page but choosing not to crawl it yet because of content that should be pruned, non-unique content, spammy content, or content that is not useful to users. So, to get these pages crawled and indexed, make their content more unique and genuinely useful for users, rather than inflating your URL count with thin content.
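The reasons above can be triaged programmatically. Below is a rough Python sketch; the `triage_page` helper and its word-count threshold are illustrative assumptions, not Google's actual rules, but they show how you might bucket each discovered-but-not-indexed URL by its most likely cause:

```python
def triage_page(status_code: int, word_count: int, min_words: int = 300) -> str:
    """Rough triage for a discovered-but-not-indexed URL.
    Thresholds are illustrative guesses, not official Google limits."""
    if status_code >= 500:
        return "server error - fix hosting/overload before anything else"
    if status_code == 404 or word_count == 0:
        return "empty or missing page - likely soft 404"
    if word_count < min_words:
        return "thin content - expand with unique, useful information"
    return "content looks substantial - wait for Google to recrawl"

print(triage_page(503, 800))   # server overload case
print(triage_page(200, 120))   # thin content case
print(triage_page(200, 900))   # nothing obviously wrong
```

Running a check like this over the whole exported URL list tells you whether you are facing a server problem or a content problem, which changes the fix.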
Another possible reason: you are publishing too many pages at once and submitting them all to Google for crawling and indexing; in that case the "Discovered – currently not indexed" count will grow.
If your website has too many similar pages, differing only in title and description and carrying thin content, they will end up in the "Discovered – currently not indexed" section.
Always make your website's URLs easily discoverable by Googlebot, and make sure they crawl successfully without server errors, soft 404s, or empty responses. If you are really seeing 99 percent of your pages not being indexed, look at the technical side first: check that you're not accidentally generating URLs with subtly different patterns, so the problem is not Google refusing to index your content but your pages getting lost in a jungle of URLs that all look very similar yet are subtly different. Things like URL parameters and upper/lower case differences can all lead to duplicate content, and if Google has discovered a lot of these duplicate URLs, it may decide it doesn't need to crawl them all because it already has some variation of the page indexed.
If you see URLs in the "Discovered – currently not indexed" section, list them out and check each one with the URL inspection tool. If the tool reports an error for a specific page, fix it, re-test the page with the live URL inspection, and then click Request Indexing so Googlebot crawls it again.
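Checking a long list of URLs one by one in the Search Console UI is slow; Google also exposes a URL Inspection API that returns the same verdict programmatically. The sketch below only builds the authenticated request (the endpoint and field names follow Google's public API documentation, but the OAuth token is a placeholder you must supply yourself, so no network call is made here):

```python
import json
import urllib.request

INSPECT_ENDPOINT = "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect"

def build_inspection_request(page_url: str, property_url: str, token: str):
    """Build a URL Inspection API request for one page.
    `token` must be a valid OAuth 2.0 access token for the property."""
    body = json.dumps({"inspectionUrl": page_url, "siteUrl": property_url})
    return urllib.request.Request(
        INSPECT_ENDPOINT,
        data=body.encode("utf-8"),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )

# Placeholder token - replace with a real access token before sending.
req = build_inspection_request("https://example.com/post",
                               "https://example.com/", "YOUR_OAUTH_TOKEN")
print(req.full_url)
```

Looping this over the exported "Discovered – currently not indexed" list (and sending each request with `urllib.request.urlopen`) gives you every page's index status in minutes instead of hours.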
Yes! Both statuses mean the URL is not indexed in Google Search results. The difference: "Discovered" means Googlebot knows about the URL but has not crawled it yet, while "Crawled – currently not indexed" means Googlebot has crawled the URL but has not indexed it.