I want to make the website available but only through links. I mean I want people to access my website only if I send them the links and I don’t want my website to be published on the Internet. Can someone help?
That’s not exactly possible. If your friends can view your links, so can search engines.
Hello, and welcome to the InfinityFree Forum!
I know that on the Google Search Console, as long as you have proven ownership over the domain or subdomain, you can ask Google to keep the domain out of Search Results for a while. I don’t know about the other Search Engines though.
No no, I mean you know how you can unlist a video on YouTube so that only people with the URL can watch it? Is there something like that?
Hi, thanks. How exactly would I do that? I wouldn’t ask you and would just search for it myself, but I don’t know what to search for.
That’s only meant for temporary removal, I’m pretty sure, not permanent ones.
Unlisted YouTube videos can still be accessed by search engines, but since Google owns YouTube, they probably don’t index those URLs.
The best way would probably be to just password protect the file / directory. Something like Cloudflare Access for custom domains or .htpasswd for free subdomains would work well.
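For the .htpasswd approach, a minimal sketch on an Apache-style server (which is what most shared hosts use) could look like this. The file paths, realm name, and username below are placeholders, not real values from your account:

```apache
# .htaccess — place this in the directory you want to protect (e.g. htdocs)
AuthType Basic
AuthName "Private site"
# Full server-side path to the password file; this path is an assumption —
# substitute your account's actual home directory path
AuthUserFile /home/youraccount/.htpasswd
Require valid-user
```

You’d then generate the password file itself with the `htpasswd` tool (or an online .htpasswd generator), e.g. `htpasswd -c .htpasswd yourname`, and ideally store it outside the public htdocs folder so it can’t be downloaded directly.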
You can prevent search engine bots from crawling your website using a robots.txt file in the root of your domain’s htdocs folder. Make the contents of it the following:
User-agent: *
Disallow: /
This doesn’t physically prevent bots from accessing your pages, but all legitimate bots, including the crawlers for search engines like Google, will obey it. (Though the security system keeps out a lot of other bots anyways.)
Alternatively, you can make Google not index specific pages by adding a “noindex” robots meta tag inside each web page’s <head> tag, though that operates on a page-by-page basis, so you’d have to do it for every page you don’t want indexed. Example:
<head>
<meta name="robots" content="noindex">
<title>Your cool website</title>
</head>
You only need to do one of these things, not both.
However, since you really only want your entire site to be accessed by specific people, I agree with GR9 that the best solution is to password protect your site’s entire directory. That’ll both prevent search engine bots from crawling it, and ensure that only people who have the password can access the site.
- Do not add your website on Google Search Console.
- Do not provide any sitemap.xml file.
- Add robots.txt and configure it. Nowadays, websites are being scraped every day for AI. You can block those bots, but not all of them will be blocked; these bots will always find ways to scrape data.
- Do not post any link to your website, even on this forum.
Just an addition to the posts above: robots.txt and robots meta tags DO NOT prevent bots from crawling your website; they just ask them not to. Bots should, but do not have to, respect this.
Also, if you put stuff in robots.txt, that is one of the first places malicious actors look, since it’s publicly visible and often contains the URLs of pages that are supposed to be secret (which is not the intended use of the file).
I searched for how to set a password, but I couldn’t find anything understandable. Do you mind telling me how to set one up, please?
This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.