Hey, I have verified and connected my website with Google and everything is fine, but for months now, whenever I submit a sitemap in Google Search Console, it always says “Sitemap could not be read” or “Couldn’t fetch”. The sitemap is in the root (htdocs) and everything looks okay; it was generated with XML-SITEMAPS. I wonder what the problem is here?
Hi
Since Google introduced the new version of GSC, many people, including me, have been running into this generic error:
https://productforums.google.com/forum/#!topic/webmasters/iE6Rig2kYD8
You also need to know that Google caches errors and keeps them until the bot comes back on its regular schedule to crawl your site.
Sitemaps are not that important as long as Googlebot is not blocked (robots.txt), because Google trusts its own bot more than the sitemaps people submit.
For other problems, see:
Manage your sitemaps using the Sitemaps report - Search Console Help
https://support.google.com/webmasters/answer/44231
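One quick local check is to confirm the file even parses as valid sitemap XML before blaming GSC. A minimal sketch with Python's standard library; the sample XML below is an illustrative stand-in for whatever XML-SITEMAPS actually generated:

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

# Illustrative stand-in for the generated sitemap.xml file.
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://oxydac.cf/</loc>
  </url>
</urlset>"""

def validate_sitemap(xml_text: str) -> list[str]:
    """Parse sitemap XML; raise if malformed, return the listed URLs."""
    root = ET.fromstring(xml_text)  # raises ParseError on broken XML
    if root.tag != f"{{{SITEMAP_NS}}}urlset":
        raise ValueError(f"unexpected root element: {root.tag}")
    return [loc.text for loc in root.iter(f"{{{SITEMAP_NS}}}loc")]

print(validate_sitemap(sample))  # ['https://oxydac.cf/']
```

If this raises, the problem is the file itself; if it passes, the file is at least well-formed and the issue is on Google's side or the hosting side.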
Thanks! Yeah, I have allowed Google’s bots on my website, so I think it’ll be fine.
I forgot to tell you that it is recommended to point bots to the sitemap’s location in robots.txt:
User-agent: *
Allow: /
Sitemap: https://oxydac.cf/sitemap.xml
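You can confirm that snippet does what you expect with Python's standard robot-file parser (a sketch; `site_maps()` needs Python 3.8+, and the domain is just the one from the example above):

```python
import urllib.robotparser

# The robots.txt content from the example above.
robots_txt = """\
User-agent: *
Allow: /
Sitemap: https://oxydac.cf/sitemap.xml
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# A bot matching "User-agent: *" is allowed to fetch the sitemap...
print(rp.can_fetch("Googlebot", "https://oxydac.cf/sitemap.xml"))  # True
# ...and the Sitemap directive is picked up.
print(rp.site_maps())  # ['https://oxydac.cf/sitemap.xml']
```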
Yeah, I already did that, but no difference ):
You will not see a change in the GSC, but it helps the bots find the sitemap.
For me this has been going on for almost a year and Google still has not fixed that bug…
But the most important thing is that you have added the domain to the GSC and Google now knows about it (sooner or later the bot will come and crawl your site regularly).
I’ve switched to the old version of the GSC (left menu, at the bottom),
and there it is clear that nothing is blocked and that the sitemap can be loaded.
Steps in the older version:
Something has gone crazy on Google’s side.
It’s a paradox: everything works, yet the system responds that it does not.
It’s all good in the old GSC; I tested it there and it’s fine, but after 1-2 minutes it does the same thing again.
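For anyone landing here later: one more thing worth ruling out is the hosting side, i.e. that the file is actually served with an HTTP 200 the way a bot would request it. A minimal sketch using a throwaway local server as a stand-in for the real host (the Googlebot User-Agent header and file name are illustrative, not what Google actually sends):

```python
import functools
import http.server
import tempfile
import threading
import urllib.request

def fetch_status(url: str) -> int:
    """Fetch a URL with a bot-like User-Agent and return the HTTP status."""
    req = urllib.request.Request(url, headers={"User-Agent": "Googlebot"})
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Throwaway local server standing in for the real web host.
with tempfile.TemporaryDirectory() as docroot:
    with open(f"{docroot}/sitemap.xml", "w") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n'
                '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"/>\n')
    handler = functools.partial(
        http.server.SimpleHTTPRequestHandler, directory=docroot)
    server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), handler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    port = server.server_address[1]
    status = fetch_status(f"http://127.0.0.1:{port}/sitemap.xml")
    print(status)  # 200 means the file itself is reachable
    server.shutdown()
```

If the same request against the live URL returns a redirect, a 403, or an error page from the host, that would explain a "Couldn't fetch" even when the file looks fine in a browser.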
This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.