You can prevent search engine bots from crawling your website by placing a robots.txt file in the root of your domain’s htdocs folder with the following contents:
User-agent: *
Disallow: /
This doesn’t physically prevent bots from accessing your pages, but all legitimate bots, including the crawlers for search engines like Google, will obey it. (Though the security system keeps out a lot of other bots anyway.)
Alternatively, you can tell Google not to index specific pages by adding a “noindex” robots meta tag inside each page’s <head> section. That works on a page-by-page basis, though, so you’d have to add it to every page you don’t want indexed. Example:
<head>
  <meta name="robots" content="noindex">
  <title>Your cool website</title>
</head>
You only need to do one of these things, not both.
However, since you only want specific people to be able to access your site at all, I agree with GR9 that the best solution is to password protect your site’s entire directory. That’ll both prevent search engine bots from crawling it and ensure that only people who have the password can access the site.
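If your host runs Apache (typical for shared hosting that gives you an htdocs folder, but worth confirming with your host), a minimal sketch of that setup looks like the following. Put an .htaccess file in the directory you want to protect; the AuthName label, the /home/protected/.htpasswd path, and the username below are placeholders to adjust for your own account:

# Require a valid username/password for everything in this directory
AuthType Basic
AuthName "Private Site"
AuthUserFile /home/protected/.htpasswd
Require valid-user

Then create the password file with Apache’s htpasswd tool (use -c only the first time, to create the file; omit it when adding more users), and keep the .htpasswd file outside your htdocs folder so visitors can’t download it:

htpasswd -c /home/protected/.htpasswd yourusername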