I’m looking into applying cache control to my website, as I have some images (loaded dynamically from G-Drive) and I want to reduce the number of daily hits as much as possible.
I read that applying a caching mechanism like this: cache-control: no-cache, no-store, must-revalidate would help reduce the load on the server.
Question is, it says to apply that in the ht.access file, which has a warning not to edit the file.
Is it still safe to do so? Or will the caching control work if I make another ht.access file inside the htdocs/test folder?
Spoiler alert: it’s actually the opposite; that header increases the load on the server because it disables the browser cache. You should look at increasing the max-age instead of disabling the browser cache entirely.
You should add the caching line in the latter location (yeah, you shouldn’t modify the file in the root folder), making sure you configure it so that it doesn’t disable caching entirely (and therefore actually reduces the load on the server). Also, it’s .htaccess, not ht.access.
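For example, something along these lines would tell browsers to keep your images and static files for a day before asking the server again (assuming mod_headers is available on your hosting; the extensions and the one-day max-age are just examples to adjust to your site):

```apache
# Cache static files in the browser for one day (86400 seconds).
<IfModule mod_headers.c>
  <FilesMatch "\.(jpg|jpeg|png|gif|webp|css|js)$">
    Header set Cache-Control "public, max-age=86400"
  </FilesMatch>
</IfModule>
```

The longer the max-age, the fewer repeat requests your server gets, at the cost of users seeing stale files for longer after you change them.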
It will, though because of the immutable part, browsers that already have the old header cached won’t pick up the new one while their copy is still fresh.
Apache rules don’t go inside HTML/PHP files, where they’d be visible for the entire public to see. They should go in .htaccess files you create inside the webroot of your website (which is your htdocs folder) or in any other folder inside it.
So, to summarise what I’ve done: I’ve created a new .htaccess file inside htdocs, and inside it I use the cache-control mechanism we spoke of above as well as the Brotli compression tool.
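In case it helps, the Brotli part of my file looks roughly like this, alongside the Cache-Control block from your example (I wasn’t sure which MIME types to list, so these are just a first guess):

```apache
# Brotli compression for text-based responses; needs mod_brotli to be available.
<IfModule mod_brotli.c>
  AddOutputFilterByType BROTLI_COMPRESS text/html text/css application/javascript
</IfModule>
```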
Quick question on this: will this be applied to all webpages inside my root folder, e.g. htdocs/test, htdocs/PokemonPBS, etc.?
You did the right thing; this way your users will already fetch the latest versions of the files if they have the old versions cached and don’t want to clear their browser cache, though I still don’t recommend it if your assets are updated daily, for example, and you want to decrease the load on the server.
This should be enough, but if you suspect bots are visiting your website rather than actual users, I recommend you also use Cloudflare and configure it as in this guide:
That way you’ll have a solid baseline in place and your website should be secure enough, but only if you have a custom domain. Oh, and you’ll also have to configure the cache there as well, as Cloudflare will bypass what you set in your .htaccess file before.
Unfortunately not; this is why I told you "if you suspect bots visiting your website and not actual users" before. If you don’t suspect a thing and you’re sure no bot comes to visit your website (and, if it’s a bad bot, even flood it with requests), then you should be fine even without it. Also, our security system blocks bots other than Google’s and Bing’s (the latter only for search indexing, though), as well as any request that doesn’t come from a browser or comes from anything without JavaScript or cookies enabled/supported, so you can stay calm.
I’ve yet to roll this site out to the users, so I’m cautious about how much the daily rate limit usage will rise. I’ve tested with a handful of users and it doesn’t go past 6%-8%.
That’s the main reason I think protecting against bots would be the better approach, regardless of whether or not I suspect such activity happening.
But as it’s not possible to do so without a custom domain, no worries.
Thanks, you’ve been a great help.
If anything else springs to mind on how to reduce the daily rate limit, please let me know.
I’ll be rolling this out to the end users sometime this week or next.
In the screenshot, you load 8 different JavaScript files. One way to reduce hits usage would be to combine all of these JavaScript files into a single file, and only reference that single file. Doing so turns 8 (or more) hits into only a single one.
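Roughly speaking, it’s the difference between these two (the file names below are just placeholders, not your actual ones):

```html
<!-- Before: every script tag is a separate hit -->
<script src="js/map.js"></script>
<script src="js/battle.js"></script>
<script src="js/ui.js"></script>
<!-- ...and so on for the remaining files -->

<!-- After: paste the files into one bundle, keeping the original order, and load just that -->
<script src="js/bundle.js"></script>
```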
Not quite sure what usage you’re referring to here.
A request to a single PHP script generates one hit. It doesn’t matter if you have many require or include statements in your code that pull in code from other files, or if you are reading other files from your website (e.g. using file_get_contents or fopen); one request from the browser is one hit.
The only situation where PHP code might generate additional hits is if it makes an HTTP request back to your own website. However, that probably won’t work in the first place.
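To make that concrete with a made-up script (all the file names and the URL here are just examples):

```php
<?php
// One browser request to this script = one hit, no matter how many files it pulls in.
require 'config.php';              // read from disk by PHP, not a separate hit
include 'templates/header.php';    // same here

// Reading a local file is also just disk access, not another hit:
$data = file_get_contents('data/items.json');

// Only something like this would count as an extra hit, because it goes back
// through the web server over HTTP (and it probably won't even work here):
// $page = file_get_contents('https://example.com/test/page.php');

echo $data;
```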