Best way to import a large database?


My database has around 150 tables (custom CMS and a forum). Some contain very little data (e.g. tables created for future features that haven’t yet been coded) while others contain a lot - e.g. an image library full of BLOBs for the Safrane, a car made by Renault…

I’ve had trouble importing the tables with a lot of data - the dump is either too big to upload, or once uploaded the import (whether via phpMyAdmin or MySQL itself) returns a parse error, times out, etc.
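One workaround I’ve been considering is splitting the dump into pieces small enough for phpMyAdmin to swallow. A minimal sketch, assuming the dump is a mysqldump-style file where each statement ends with “;” at the end of a line (the chunk size and file naming are my own choices, not anything the host requires):

```python
import sys


def split_dump(lines, max_bytes=1_000_000):
    """Group complete SQL statements into chunks of at most ~max_bytes.

    Chunks only break at a statement boundary (a line ending in ";"),
    so each output file can be imported on its own.
    """
    chunks, current, size = [], [], 0
    for line in lines:
        current.append(line)
        size += len(line.encode("utf-8"))
        if line.rstrip().endswith(";") and size >= max_bytes:
            chunks.append("".join(current))
            current, size = [], 0
    if current:
        chunks.append("".join(current))
    return chunks


if __name__ == "__main__":
    # Usage: python split_dump.py bigdump.sql -> chunk_000.sql, chunk_001.sql, ...
    with open(sys.argv[1], encoding="utf-8") as f:
        for i, chunk in enumerate(split_dump(f)):
            with open(f"chunk_{i:03d}.sql", "w", encoding="utf-8") as out:
                out.write(chunk)
```

Each chunk file can then be uploaded and imported separately, which keeps every single request under the host’s upload and execution-time limits.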

So I tried the following SQL:
LOAD DATA INFILE '/accounts.sql' INTO TABLE accounts

Naturally, access was denied with error code 1045 (it was only a small table with 4 accounts in it, but it served as a quick and easy test).

So I’m still a bit stumped - obviously on a VPS or dedicated box I could log in as root/admin and sort it, but on shared platforms I need a different method.

I do have one other way… I have code that takes each row from the table, base64-encodes every column and then transmits it to the server via an HTTP POST request (along with PHP code to execute the insert). However, on some servers this causes the firewall to block my IP - I apparently get flagged as a DDoS attacker. Other than that, this method has worked well for me in the past on several servers where I struggled to upload data.
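The sending side of that approach can be sketched roughly as below. The endpoint URL, batch size, and delay are all assumptions to tune for the host - the point of the sketch is that batching rows and pausing between requests is what keeps the firewall from treating the transfer as a flood:

```python
import base64
import json
import time
import urllib.request

# Hypothetical receiver script on the destination server.
ENDPOINT = "https://example.com/import.php"


def encode_row(row):
    """Base64-encode each column so binary/BLOB data survives the POST."""
    return [base64.b64encode(col).decode("ascii") for col in row]


def send_batch(rows, endpoint=ENDPOINT):
    """POST one batch of encoded rows as JSON to the receiver."""
    payload = json.dumps({"rows": [encode_row(r) for r in rows]}).encode()
    req = urllib.request.Request(
        endpoint, data=payload,
        headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return resp.status


def transfer(all_rows, batch_size=50, delay=1.0):
    """Send rows in small batches, pausing between requests
    so the traffic doesn't look like an attack."""
    for i in range(0, len(all_rows), batch_size):
        send_batch(all_rows[i:i + batch_size])
        time.sleep(delay)
```

Sending 50 rows per request with a one-second pause is orders of magnitude gentler than one request per row, which is presumably what trips the firewall.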

Any thoughts please?


Did you check this article already?

Also, I would recommend reconsidering the decision to store files in a database. Databases are not optimized for storing files. Storing the images as files on your account and linking to them from within your HTML code is much more efficient than using a PHP script to pull them from the database and serve them from there.

And while we don’t have hard limits on how large your database can be, some hosting providers do, and storing image galleries in a database is a great way to hit that limit very quickly.
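A minimal sketch of migrating the BLOBs out to disk, keeping only a path in the table (the directory, extension, and content-addressed naming are my own assumptions - any unique naming scheme works):

```python
import hashlib
import pathlib


def blob_to_file(blob, media_dir="media", ext="jpg"):
    """Write binary data to disk; return the relative path to store
    in the database in place of the BLOB column."""
    # Content-addressed name: identical images map to the same file.
    name = hashlib.sha1(blob).hexdigest()
    path = pathlib.Path(media_dir) / f"{name}.{ext}"
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_bytes(blob)
    return str(path)
```

The HTML then links straight to the stored path, so the web server handles the image itself instead of a PHP script streaming it out of MySQL on every request.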


This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.