I am writing to report an issue with my sitemap. I have submitted my sitemap to Google Search Console, but it is showing the status “Couldn’t fetch”. I have checked the sitemap and it is valid and in the correct location.
I have tried the following troubleshooting steps:
- I have checked the URL of the sitemap and it is correct.
- I have made sure that the sitemap is publicly accessible.
- I have waited a few days to see if the issue would resolve itself, but it has not.
I am not sure what else to do. Could you please help me troubleshoot this issue, or should I contact Google? Even some sitemap scanners say that my sitemap is not in a valid format, although it is correctly located at /sitemap.xml on my domain.
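Since the scanners complain about the format, a quick local sanity check can rule out well-formedness and namespace problems before blaming GSC. Here is a minimal sketch (the sample sitemap and URL are placeholders, not your actual file); a valid sitemap must declare the sitemaps.org namespace on its root `<urlset>` element:

```python
import xml.etree.ElementTree as ET

# Minimal well-formed sitemap for comparison; replace with your own file's contents.
SAMPLE = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
  </url>
</urlset>"""

def check_sitemap(xml_text: str) -> list[str]:
    """Return a list of problems found in the sitemap XML (empty list = looks OK)."""
    problems = []
    try:
        root = ET.fromstring(xml_text)
    except ET.ParseError as exc:
        return [f"not well-formed XML: {exc}"]
    ns = "{http://www.sitemaps.org/schemas/sitemap/0.9}"
    if root.tag != ns + "urlset":
        problems.append(f"root element is {root.tag!r}, expected {ns}urlset")
    if not root.findall(f"{ns}url/{ns}loc"):
        problems.append("no <url><loc> entries found")
    return problems

print(check_sitemap(SAMPLE))  # prints [] when the sitemap passes both checks
```

If this passes locally but scanners still reject the live URL, the server may be returning an error page or the wrong Content-Type instead of the XML itself, which is worth checking with your browser's network tab.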
Sitemaps are not that important as long as Googlebot is not blocked by robots.txt,
because Google trusts its own bot more than the sitemaps people submit.
It’s important that Google knows about your domain and that you have added it to GSC.
Sooner or later the bot will come and dig through it.
I checked your robots.txt, and none of the URLs in your sitemap are blocked there.
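For reference, a robots.txt that blocks nothing and also advertises the sitemap location looks like this (the domain is a placeholder for your own):

```
User-agent: *
Disallow:

Sitemap: https://example.com/sitemap.xml
```

The `Sitemap:` line is optional, but it gives crawlers a second way to discover the file besides the GSC submission.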
GSC often reports problems even when everything is fine on your end,
and this has been going on for quite some time (since they introduced the new version).
Sometimes it just reports “Couldn’t fetch”, which actually means the bot hasn’t come to index yet and plans to come in the next two weeks or a few months.
Sometimes, if your domain is quite fresh, Google doesn’t have DNS data for it yet,
so it reports a problem for that reason too.
It might also help to insert this snippet into the head section of each page you want the bots to index: <meta name="robots" content="index,follow">
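For placement, the tag goes inside <head> on each page; a minimal example (page title and content are placeholders):

```html
<!doctype html>
<html>
  <head>
    <meta charset="utf-8">
    <!-- Tells crawlers this page may be indexed and its links followed -->
    <meta name="robots" content="index,follow">
    <title>Example page</title>
  </head>
  <body>
    <p>Page content.</p>
  </body>
</html>
```

Note that index,follow is already the default behavior, so this tag mainly helps rule out a stray noindex somewhere.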
Unless I overlooked something, I believe you simply have to wait for the mercy of Google: in a few weeks or longer it will suddenly report that it has indexed your sitemap and found everything it needs.
The only other thing you can check is whether you added the domain to GSC with https or only with http, and with www or without.