Would you mind if your site had errors? If your site cannot be accessed, obviously your content cannot be crawled either. As a prerequisite, Google examines the status of your site's DNS, hosting server and robots.txt before stepping into the crawling process.
You can click on each button to access the reported data for the last 90 days.
If Googlebot couldn't communicate with the DNS server properly, Google won't be able to access your site, so you will be notified here about any DNS issues being experienced.
Common causes for connectivity errors are that your server is completely down or too busy to respond to Googlebot, perhaps due to an exceeded bandwidth limit. In addition, there could be server configuration issues that cause conflicts with Googlebot too.
Googlebot looks for robots.txt before it starts crawling your site, to determine whether it is allowed to crawl and which pages are disallowed from indexing. If Googlebot is blocked from crawling your site or robots.txt is inaccessible, Google won't crawl your site at all.
Wanna see the robots.txt file of your site? View your site's robots.txt by appending /robots.txt to your site address - i.e. www.mayura4ever.com/robots.txt
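For reference, a typical robots.txt looks something like this (a made-up example with a placeholder domain, not the actual file of any particular site):

```
User-agent: *
Disallow: /search
Allow: /

Sitemap: http://www.example.com/sitemap.xml
```

Here `User-agent: *` means the rules apply to all crawlers, `Disallow: /search` blocks crawling of search result pages, and the `Sitemap` line tells crawlers where to find your sitemap.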
Not found - Not found errors occur when Google tries to crawl a page that doesn't exist on your site. Mostly it would be a page that has been removed from your site. Further, Googlebot may visit your pages from external sites via backlinks.
If someone has linked to a page removed from your site or misspelled the URL, it will lead to a non-existent page, which returns the response code 404, aka Not found.
Reviewing Not found errors is a good opportunity to find out whether someone is linking to a non-existent page on your site. Why not, it may even come from your own site. It's another way to catch broken links.
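You can also do a quick 404 check on your own links, outside of Webmaster Tools. Here's a minimal sketch using only Python's standard library (the URL in the comment is a placeholder, not a real page):

```python
# Check the HTTP status of a URL to spot broken (404) links.
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def status_of(url):
    """Return the HTTP status code for a URL, or None if unreachable."""
    try:
        req = Request(url, method="HEAD")  # HEAD: we only need the status
        with urlopen(req, timeout=5) as resp:
            return resp.status
    except HTTPError as e:
        return e.code          # e.g. 404 for a non-existent page
    except URLError:
        return None            # DNS failure or connection error

# Usage (hypothetical URL):
# status_of("http://www.mayura4ever.com/removed-page")  # 404 if it's gone
```

Run it over the URLs in your posts and anything returning 404 is a broken link worth fixing.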
Just click on the Not found box and you will be able to see the URLs that have been identified as non-existent.
Further, clicking on each URL will allow you to explore more details about the error and how Googlebot found that URL.
Jump to the Linked from tab to find the backlinks pointing to that specific URL. Warning: You will find some broken links ;) After fixing the issue, you can click the Mark as fixed button to make the URL disappear from the list of Not found URLs.
Other - Other errors are errors experienced by Google besides server and Not found errors, but they still prevent Google from crawling your pages. For example, protected content that requires user credentials to access.
If your site has pages that are not meant for public access and they appear beneath Other, you can ignore such URLs.
Most importantly, you can view the number of pages being crawled per day for the last 90 days. Further, it will allow you to access download information related to the crawling process.
Once you test out robots.txt against a URL, the results will be shown below.
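If you'd like to test robots.txt rules yourself, Python's standard urllib.robotparser does essentially the same check. A small sketch, using made-up rules and a placeholder domain:

```python
# Test whether a given user-agent may fetch a URL under robots.txt rules.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
# Feed rules directly for this demo; against a live site you would call
# rp.set_url("http://www.example.com/robots.txt") followed by rp.read().
rp.parse([
    "User-agent: *",
    "Disallow: /search",
])

print(rp.can_fetch("Googlebot", "http://www.example.com/about"))   # True
print(rp.can_fetch("Googlebot", "http://www.example.com/search"))  # False
```

Since no rule group names Googlebot specifically, it falls back to the `*` rules, just as real crawlers do.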
Fetch as Google
You can click on individual fetch statuses to view how Googlebot fetched that particular page too - a detailed report.
How many pages are indexed by Google right now? You must be curious to know.
● A plugin, gadget or code snippet you have added to your site is acting as active malicious software
● Someone else has taken control of your site and is adding malicious content. Simply put, your site has been hacked.