Editing .htaccess

Hi guys,

I’m looking into applying cache control to my website, as I have some images (loaded dynamically from G-Drive) and am looking to reduce the number of daily hits as much as possible.

I read that applying a caching mechanism like this:
cache-control: no-cache, no-store, must-revalidate would help reduce the load on the server.

Question is, it says to apply that in the ht.access file, which has a warning not to edit the file.
Is it still safe to do so? Or will the cache control work if I make another ht.access file inside the htdocs/test folder?

Cheers,
Pokemon Empire Team

https://pokemon-empire-pbs.infinityfreeapp.com/Test/?i=3

Spoiler alert: it’s actually the opposite. Those directives disable the browser cache entirely, so they increase the load on the server. You should look at increasing the max-age instead of disabling the browser cache.

You should add the caching line in the latter location (yes, you shouldn’t modify the file in the root folder), making sure you configure it so that it doesn’t disable caching entirely (which would increase the load on the server). Also, it’s .htaccess, not ht.access.
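For example, a minimal .htaccess using mod_headers might look like this (the one-week max-age and the file extensions are just illustrative; adjust them to how often your assets actually change):

```apache
# Sketch: cache static assets in the browser for one week
<IfModule mod_headers.c>
    <FilesMatch "\.(png|jpe?g|gif|webp|css|js)$">
        Header set Cache-Control "public, max-age=604800"
    </FilesMatch>
</IfModule>
```

With this in place, repeat visitors reuse their cached copies instead of hitting the server again for every image.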

3 Likes

Thanks for the fast response!

So on the first point, would Cache-Control: public, max-age=31536000, immutable be better?

On the second point: If it’s not advisable to edit the .htaccess file, where would you recommend adding it? Inside the main index.html I assume?

It will, though the immutable part means browsers that already have a fresh cached copy won’t pick up a new version (or new headers) until that copy expires.

Apache rules don’t go inside HTML/PHP files, where they’d have no effect and be visible for the entire public to see. They should go in .htaccess files you create inside the webroot of your website (which is your htdocs folder) or in any other folder inside it.

6 Likes

the hosting root contains a /.htaccess which cannot be edited

you can however add one in your website root /htdocs/.htaccess or any sub directory that requires anything special

@JxstErg1 heh I think we were both replying at the same time :smiley:

5 Likes

Great, thanks!

So to summarise what I’ve done: I’ve created a new .htaccess file inside htdocs, and inside it I use the cache-control mechanism we spoke of above, as well as Brotli compression.
Quick question on this: Will this be applied to all webpages inside my root folder? i.e. htdocs/test, htdocs/PokemonPBS etc?

<IfModule mod_headers.c>
    Header set Cache-Control "public, max-age=31536000"
</IfModule>

<IfModule mod_brotli.c>
    # Enable Brotli compression for text-based content
    AddOutputFilterByType BROTLI_COMPRESS text/plain text/css application/json
</IfModule>

Furthermore, within the index.html file I’ve added versioning to the data files like so:
[screenshot: versioned file references in index.html]
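The screenshot isn’t reproduced here, but query-string versioning typically looks something like this (file names and version numbers are hypothetical):

```html
<!-- Bump ?v= whenever the file changes, so cached copies are bypassed -->
<script src="data/moves.js?v=3"></script>
<script src="data/pokedex.js?v=3"></script>
```

Because the URL changes with each version, browsers treat the updated file as new even with a long max-age.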

Is there anything that you recommend to include to lower the daily hit rate as much as possible?

1 Like

It will if you have it on your htdocs folder.

You did the right thing; this way your users will fetch the latest versions of the files even if they have the old versions cached, without having to clear their browser cache. Though if your assets are updated frequently (daily, for example), I still wouldn’t recommend such a long max-age if you want to decrease the load on the server.

This should be enough, but if you suspect bots visiting your website and not actual users, I recommend you also use Cloudflare and configure it like in this guide:

That way you’ll have a solid baseline applied and your website should be secure enough, but only if you have a custom domain. Oh, and you’ll also have to configure the cache there as well, as Cloudflare will bypass what you had in your .htaccess file before.

6 Likes

The assets are updated once a month, sometimes once every 3 months.

Regarding bots, I don’t use a custom domain. Is there another way to protect the site from bots?

Unfortunately not; this is why I said “if you suspect bots visiting your website and not actual users” before. If you don’t suspect a thing and you’re sure no bots visit your website (or, in the case of a bad bot, flood it with requests), then you should be fine even without Cloudflare. Also, our security system blocks bots other than Google’s and Bing’s (and those only for search indexing), as well as any request that doesn’t come from a browser or anything without JavaScript and cookies enabled/supported, so you can stay calm.

4 Likes

I’m yet to roll this site out to the users, so I’m cautious about how much the daily hit usage will rise. I’ve tested with a handful of users and it doesn’t go past 6%-8%.
Which is the main reason why I think protecting the site from bots would be the better approach, regardless of whether or not I suspect such activity happening.

But as it’s not possible to do so without a custom domain, then no worries.

Thanks, you’ve been a great help.

If anything else springs to mind on how to reduce the daily hit usage, please let me know.
I’ll be rolling this out to the end users sometime this week or next.

thanks again!

2 Likes

In the screenshot, you load 8 different JavaScript files. One way to reduce hits usage would be to combine all of these JavaScript files into a single file, and reference only that single file. Doing so turns 8 (or more) hits into just one.
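Assuming the files can simply be concatenated in their original load order, the markup change would look something like this (file names are hypothetical):

```html
<!-- Before: one hit per file -->
<!--
<script src="utils.js"></script>
<script src="moves.js"></script>
<script src="pokedex.js"></script>
-->

<!-- After: all files concatenated into bundle.js, one hit total -->
<script src="bundle.js?v=1"></script>
```

This only works safely if none of the scripts depend on loading in separate files; keep the same order inside the bundle as in the original script tags.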

6 Likes

Thanks for the tip!

Is it right to assume that the number of requests displayed in the network logs is a way of tracking the number of hits the site will generate?

Generally, yes. However, if you have PHP files that are calling other files behind the scenes, that can create additional hits as well.

6 Likes

Yes.

Not quite sure what usage you’re referring to here.

A request to a single PHP script generates one hit. It doesn’t matter if you have many require or include statements in your code that pull in code from other files, or are reading other files from your website (e.g. using file_get_contents or fopen), one request from the browser is one hit.

The only situation where PHP code might generate additional hits is if it makes an HTTP request back to your own website. However, that probably won’t work in the first place.

4 Likes

This topic was automatically closed 7 days after the last reply. New replies are no longer allowed.