Sitemap not being fetched by Google

Sitemaps aren't that important as long as Googlebot isn't blocked in robots.txt,
because Google trusts its own crawler more than the sitemaps people submit.
What matters is that Google knows about your domain and that you've added it to GSC.
The bot will come and dig through soon enough.


Everything seems fine and Googlebot can fetch your sitemap.xml.
Test result: https://search.google.com/test/rich-results/result?id=RKqUxeehL9cAx1QcQtA1cg

I also checked your robots.txt, and none of the URLs in your sitemap are blocked there.
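
For reference, a robots.txt that blocks nothing and advertises the sitemap looks something like this (the sitemap URL below is just a placeholder for your own):

```txt
# Allow every crawler to fetch everything (an empty Disallow means no restrictions)
User-agent: *
Disallow:

# Optional: point crawlers at your sitemap (replace with your actual URL)
Sitemap: https://example.com/sitemap.xml
```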


GSC often reports problems even when everything is fine on your end,
and this has been going on for quite some time (since they introduced the new version).

Sometimes it just reports "Couldn't fetch", which actually means the bot hasn't come to index the site yet and plans to visit in the next two weeks to a few months.

Sometimes, if your domain is quite fresh, Google doesn't have DNS data for it yet,
so it reports that as a problem too.

It might also help to insert this piece of code into the head section of each page you want the bots to index:
<meta name="robots" content="index,follow">
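
In context it would look something like this (the charset and title are just placeholders):

```html
<head>
  <meta charset="utf-8">
  <title>Example page</title>
  <!-- Tell crawlers to index this page and follow its links -->
  <meta name="robots" content="index,follow">
</head>
```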

Unless I overlooked something… I believe you simply have to wait for the mercy of Google: in a few weeks (or longer) it will suddenly report that it has indexed your sitemaps and found everything it needs.


The only thing left to check is whether you added the domain to GSC with HTTPS or only with HTTP, and with www or without.
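
If you added it as a URL-prefix property, each of these counts as a separate property in GSC (a Domain property covers them all); example.com stands in for your domain:

```txt
http://example.com/
https://example.com/
http://www.example.com/
https://www.example.com/
```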
