So, I recently blew up my Drupal installation and was caught with my pants down and no DB backup…
Well, as you can tell, I've re-installed and configured it again, and now I need to back it up. What I really need is an automated backup strategy, but that starts with a single backup. The concept is straightforward, and the path is well laid out. Step 1: export the MySQL database using the phpMyAdmin site my host provides. Step 2: zip up the site files while I'm at it. Easy peasy lemon squeezy!
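On a host that allows shell access, those two steps boil down to a pair of commands. This is just a sketch with placeholder names (dbuser, drupal_db, the site path); on a shared host without a shell, phpMyAdmin's Export tab produces the same gzipped SQL file:

```shell
# Step 1: export the database as gzipped SQL (placeholder credentials).
#   mysqldump -u dbuser -p drupal_db | gzip > db-backup.sql.gz

# Step 2: archive the site files alongside it.
#   tar -czf site-backup.tar.gz /path/to/drupal

# Stand-in run so both steps can be seen end to end without a live
# database: a one-line fake dump and a tiny fake site tree.
printf 'CREATE TABLE node (nid INT);\n' | gzip > db-backup.sql.gz
mkdir -p demo-site && echo '<?php' > demo-site/index.php
tar -czf site-backup.tar.gz demo-site
ls db-backup.sql.gz site-backup.tar.gz
```

Gzipping on the way out of mysqldump keeps the dump small enough to download quickly, which matters once you start doing this on a schedule.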
Now, did it work? How do I restore this stuff? The file system is no problem: I can just unzip from my site archive directory on the server. But what about the database? phpMyAdmin absolutely lets you upload SQL files to execute, so I created a temp database for testing purposes and uploaded my gzipped SQL file (1.5 MB in size; the upload limit, at least at my host, is 2 MB). After churning for a while, it failed with some variety of error. Hmm, very disappointing.
At this point, I extracted the SQL from the gzip and decided to try it uncompressed, in case phpMyAdmin's decompressor was having issues. That's when I noticed the SQL dump is 15 MB, and this is a pretty light install of Drupal; most of the content is my imported years' worth of blog entries. That's well above the admin tool's 2 MB upload limit.
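For the record, with shell access on the host none of this would matter, since piping the dump straight into the mysql client sidesteps the web upload limit entirely. A sketch with placeholder names (this is the route I couldn't take, which is what led to the hunt below):

```shell
# Hypothetical restore on a host with shell access; 'dbuser' and
# 'temp_db' are placeholders. The pipe means the 15 MB of SQL never
# has to pass through the 2 MB web upload limit:
#   gunzip -c db-backup.sql.gz | mysql -u dbuser -p temp_db

# Stand-in run: decompressing a one-line fake dump shows the first
# half of that pipeline without needing a MySQL server.
printf 'CREATE TABLE node (nid INT);\n' | gzip > db-backup.sql.gz
gunzip -c db-backup.sql.gz
# → CREATE TABLE node (nid INT);
```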
My first inclination was to search for some script driving mysql that I could run from the command line. Instead I stumbled on BigDump (http://www.ozerov.de/bigdump.php). It's a PHP script that imports the dump in batches and can be invoked from the browser. You simply place it on your site, upload your SQL script, and point it at the DB (sadly the DB config lives inside the PHP script, so it must be pre-configured for each DB you want to use it on), then hit it with a web browser and tell it to go. Leave your browser open and it works like a charm!
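That pre-configuration step amounts to editing a handful of variables near the top of bigdump.php. From memory of the copy I used (names and values here are placeholders and may differ between versions):

```php
// Settings block near the top of bigdump.php:
$db_server   = 'localhost';
$db_name     = 'temp_db';
$db_username = 'dbuser';
$db_password = 'secret';

// Optionally preset the dump file so the file-picker is skipped:
$filename    = 'db-backup.sql.gz';
```

One caution that follows from this: the script sits on your public site with database credentials inside it, so delete it (and the dump file) as soon as the import finishes.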
Give it a shot if you find yourself in a similar situation!