What is the problem with FTP upload?

I have serious issues uploading my website. I appreciate the fact that InfinityFree provides everything for free, but why is such a basic operation plagued with bugs?

First of all, I tried to upload the whole directory with WinSCP. The upload speed is somewhere around 8 KiB/s, and it takes 4-6 hours to complete, which is unacceptable.

I switched to FileZilla, which supports uploading multiple files in parallel. The speed is good. However, the total upload is large because the files are uncompressed, and it is prone to network errors, which cause some text files (mainly .php) to fail in the process. Sure, I can try to re-upload these files, but I need something more reliable and more efficient than that.

So I decided to compress everything into a .tar.gz and upload it (sadly neither .tar.bz2 nor .tar.xz is supported). Most of the time, the file uploads successfully, but as soon as I select Extract in the Online File Manager web interface, an FTP upload error is shown and the 160 MiB file just disappears into thin air (wth?!). Sometimes it does work, but the decompression speed is super slow – like 9 out of 5400 files in 5 minutes.
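For context, this is roughly how I build the archive locally before uploading – a minimal Python sketch, assuming the site lives in a local `htdocs` folder (the path and output name are just examples):

```python
import tarfile
from pathlib import Path

site_root = Path("htdocs")           # local copy of the website (example path)
archive_name = "site-backup.tar.gz"  # example output name

# "w:gz" writes a gzip-compressed tar; "w:bz2" and "w:xz" also exist in
# tarfile, but the host only accepts .tar.gz as noted above.
with tarfile.open(archive_name, "w:gz") as tar:
    # everything is stored under "htdocs/" inside the archive
    tar.add(site_root, arcname=site_root.name)
```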

I tried to make everything fast and save as much bandwidth as possible for InfinityFree. But why does everything work like this? Is using FileZilla to upload an uncompressed directory with >5000 files my only option now?

The max upload limit is 10 MB. You can upload your files as long as each one is under 10 MB.
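If it helps, a quick local pre-check like this (a hedged sketch; the `htdocs` path and the exact limit semantics are assumptions) lists any files that would exceed that limit before you start an upload:

```python
from pathlib import Path

LIMIT = 10 * 1024 * 1024  # 10 MB, per the reply above

# Walk the local site and flag anything over the limit so it doesn't fail
# (or silently vanish) halfway through an upload.
for path in Path("htdocs").rglob("*"):
    if path.is_file() and path.stat().st_size > LIMIT:
        size_mb = path.stat().st_size / (1024 * 1024)
        print(f"{path} is {size_mb:.1f} MB – over the 10 MB limit")
```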


Yes, this is the best option.

FTP is the common standard for transferring files to web hosting, but it’s troubled because:

  • FTP is very sensitive to latency. Every file transferred needs to be negotiated individually (see the sketch after this list). Our servers are in the UK. If you’re in Europe, you can transfer files quite quickly. On other continents, not so much.
  • FTP only does file transfers. There is no way to extract an archive “on the server”. While the Online File Manager does support creating and extracting archives, it does so by downloading the files from the FTP server, creating or extracting the archive locally, and then uploading the new files to the server. MonstaFTP is hosted in the same datacenter as the FTP server, so latency isn’t that big of an issue, but it’s a PHP script which does the transfers, so it’s not particularly efficient.
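To illustrate the first point, here is a rough Python sketch of what any FTP client does for a directory upload: one command exchange per directory and per file, so every item pays the full round-trip cost. The hostname, credentials and paths are placeholders, not real account details:

```python
import ftplib
from pathlib import Path

def upload_tree(ftp: ftplib.FTP, local_root: Path, remote_root: str) -> None:
    for path in sorted(local_root.rglob("*")):
        remote = f"{remote_root}/{path.relative_to(local_root).as_posix()}"
        if path.is_dir():
            try:
                ftp.mkd(remote)                       # one round trip per directory
            except ftplib.error_perm:
                pass                                  # directory probably exists already
        else:
            with open(path, "rb") as fh:
                ftp.storbinary(f"STOR {remote}", fh)  # several round trips per file

ftp = ftplib.FTP("ftp.example.com")      # placeholder hostname
ftp.login("epiz_XXXXXXXX", "password")   # placeholder credentials
upload_tree(ftp, Path("htdocs"), "/htdocs")
ftp.quit()
```

With thousands of small files, that per-file negotiation is exactly where the hours go.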

If you haven’t done so yet, you can change the number of parallel connections in FileZilla from 2 to 10. This will probably quintuple your transfer speed, because latency, not raw bandwidth, is the limiting factor with transfers like this.
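In the same spirit, this is a hedged sketch of what those parallel connections amount to: several independent FTP sessions working through the file list at once, so the latency overlaps instead of adding up. Again, hostname, credentials and paths are placeholders, and a real client would reuse one connection per worker rather than reconnecting per file:

```python
import ftplib
from concurrent.futures import ThreadPoolExecutor
from pathlib import Path

LOCAL_ROOT = Path("htdocs")
REMOTE_ROOT = "/htdocs"
WORKERS = 10  # roughly FileZilla's "maximum simultaneous transfers" setting

def upload_one(path: Path) -> None:
    # Each task opens its own session; ftplib connections aren't thread-safe.
    with ftplib.FTP("ftp.example.com") as ftp:   # placeholder hostname
        ftp.login("epiz_XXXXXXXX", "password")   # placeholder credentials
        remote = f"{REMOTE_ROOT}/{path.relative_to(LOCAL_ROOT).as_posix()}"
        with open(path, "rb") as fh:
            ftp.storbinary(f"STOR {remote}", fh)

files = [p for p in LOCAL_ROOT.rglob("*") if p.is_file()]
with ThreadPoolExecutor(max_workers=WORKERS) as pool:
    list(pool.map(upload_one, files))  # assumes the remote directories already exist
```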


I would like to suggest 2 things:

  1. There should be an archive upload. My old paid host has a backup feature which compresses everything into a single .tar.bz2 for users to download. Then, it allows users to upload a single .tar.bz2 and automatically decompresses it to restore the whole website’s home directory. It is very convenient.
  2. When a file larger than 10 MiB is uploaded, please give some kind of warning/error. Don’t delete it silently. It can lead to unforeseen consequences if it goes unnoticed. To be honest, I don’t see the file size limit written anywhere, at least not on the Online File Manager and FTP Accounts pages.

I wish we could, but I’m not sure our FTP server even lets you enforce file size limits (it’s currently enforced at the storage level). And even if the FTP server supports this, I don’t know how well MonstaFTP actually forwards this message to the user (or whether it just shows “file transfer failed” or something like that). And I don’t know how other FTP clients would react either.

iFastNet’s paid hosting has this too. Free hosting doesn’t.

