CLI mysqldump

The mysqldump Program

Now that we know you can connect remotely to your hosting solution with SSH, and have practiced a few commands to make sure you're in the right directory, let's focus on the MySQL database server for a bit. This is where we store most of our data, which means we need to learn how to back it up and restore it. Let's start with backing up.

There's a command-line program that ships with the MySQL server called mysqldump.
You can use it as any user (super user or not) to access any database on the MySQL server. (Well, assuming you have the login details, etc.)

For WordPress blogs and XenForo forum software I recommend the simplest form, like this:
mysqldump --opt -Q -u DATABASE_USERNAME_HERE -p DATABASE_NAME_HERE > FILE.sql

Security tip: we do not type the password on the command line itself, so "-p" on its own is enough; mysqldump will prompt you for the password after you press enter. (A password typed directly on the command line would end up in your shell history and be visible to other users in the process list.)

-u is for the database username that has access to the database.
-p tells mysqldump to prompt for that user's password.
--opt enables a group of sensible default options (--add-drop-table, --lock-tables, --quick, --extended-insert, and more) that produce a fast, restorable dump; on recent MySQL versions it is already on by default.
-Q (--quote-names) wraps table and column names in backticks, which keeps the dump restorable even if a name clashes with a reserved word.

The placeholders tell mysqldump which username, database, and output file to use, so replace all three:

DATABASE_USERNAME_HERE should be replaced with the database username that has access to the database you want to back up.

DATABASE_NAME_HERE should be replaced with the name of the database we want to back up.

FILE.sql should be replaced with the file name, with the extension .sql, that we want to dump the database into.

For example, say you have a XenForo database called xenforo, with user xenforo_bob and password secret, and you want to make a backup called xenforo_backup.sql (a single file you can download to your computer). Then your command line looks like this:

mysqldump --opt -Q -u xenforo_bob -p xenforo > xenforo_backup.sql

Press enter and it will ask for the password, like so:
Enter password: ******

If something is wrong it will print an error: you don't have permission, the MySQL server is not running, or something else. You will then need to stop the backup process and search for the answer (or post it on this forum and maybe someone will reply).

If it doesn't error, it will look like it's doing nothing until it has dumped the whole database into that one file.

When it's done, you get the command line prompt back again, without any errors.

Once you're familiar with this dumping process (which parameters require a value, and what those values are), and you can get a complete dump of your database into a single file you can download, move, rename, or delete, you're ready for some more pro steps.

Pro Steps

Since the .sql dump file is basically just plain text, you can gzip or zip it to compress it to a much smaller size, making it easier and faster to transfer and archive.
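You can even compress while you dump, so the full-size uncompressed file never touches the disk. A minimal sketch, using the same placeholders as the command above and assuming gzip is installed (it is on virtually every Linux host):

```shell
# Dump and compress in one step; gzip -9 trades a little CPU
# for the smallest possible file.
mysqldump --opt -Q -u DATABASE_USERNAME_HERE -p DATABASE_NAME_HERE \
  | gzip -9 > FILE.sql.gz

# To restore later, decompress on the fly as well:
# gunzip < FILE.sql.gz | mysql -u DATABASE_USERNAME_HERE -p DATABASE_NAME_HERE
```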

Since the database constantly changes, you might want to make a script that automatically backs up your database, replacing the oldest version.

That setup is prone to silent corruption, however, so I strongly recommend timestamping the .sql.gz dumps. Not only does this protect you against corruption (if a corrupted database gets backed up, you can go further back in history to find a dump from before the corruption), but you can also use the timestamps to archive your backups and rotate through them with a custom script (delete all but the current month, for example).
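A timestamped, rotating backup script could look like this minimal sketch. The database name is the one from this article's example; the ~/.my.cnf.backup credentials file (a [client] section containing user= and password=) is an assumption so the script never has to type a password interactively:

```shell
#!/bin/sh
# backup-xenforo.sh -- sketch: timestamped, compressed, rotating dump.
# Assumes credentials live in ~/.my.cnf.backup ([client] user=... password=...)
# so no interactive password prompt is needed.
DB_NAME=xenforo
BACKUP_DIR="$HOME/sql"
STAMP=$(date +%Y-%m-%d)                 # e.g. 2012-06-12
mkdir -p "$BACKUP_DIR"
mysqldump --defaults-extra-file="$HOME/.my.cnf.backup" --opt -Q "$DB_NAME" \
  | gzip > "$BACKUP_DIR/${DB_NAME}_${STAMP}.sql.gz"
# Rotation: keep roughly a month of history, delete anything older.
find "$BACKUP_DIR" -name "${DB_NAME}_*.sql.gz" -mtime +31 -delete
```

Because each dump gets its own dated file, a bad night's backup never overwrites a good one.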

Using crontab on Linux you can run an hourly, daily, or weekly backup script that makes these backups automatically, timestamps them, compresses the .sql files, and manages the archive. It can even push an off-site copy to a remote server.
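A crontab entry for that could look like this (edit it with crontab -e as the backup user; /home/backups/bin/backup.sh is a hypothetical path standing in for whatever script you wrote):

```
# fields: minute hour day-of-month month day-of-week command
# every night at 03:30
30 3 * * * /home/backups/bin/backup.sh
# or weekly, on Sundays at 04:00
0 4 * * 0 /home/backups/bin/backup.sh
```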

Pro Tip
If you are going to manage and archive multiple databases for multiple users, I recommend backing up as a super user with access to all of those databases. If you only need the databases under one account, however, don't use a super user. And if you are going to automate backing up all of your MySQL server's databases, I strongly recommend running a shell script via crontab that does all the steps for you, and storing the dumps under a special backups account (with its own directory), for a few reasons.
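For the dump-everything case, mysqldump's --all-databases option puts every database on the server into one file. A sketch under the same assumption as before, that ~/.my.cnf.backup holds a privileged user's credentials; the paths are illustrative:

```shell
#!/bin/sh
# backup-all.sh -- sketch: dump every database on the server in one archive.
# Assumes ~/.my.cnf.backup holds a privileged user's credentials
# ([client] user=... password=...), so cron never has to type a password.
BACKUP_DIR="$HOME/sql"
STAMP=$(date +%Y-%m-%d)
mkdir -p "$BACKUP_DIR"
mysqldump --defaults-extra-file="$HOME/.my.cnf.backup" --opt -Q --all-databases \
  | gzip > "$BACKUP_DIR/all-databases_${STAMP}.sql.gz"
```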

Benefits over a browser

My personal policy regarding handling a hosting solution is as simple as this: If you are going to use your account, you're a user. With this I mean you're developing your web site, uploading pictures, and updating your blog, etc. You will be using SSH/SFTP/PHPMyAdmin to access your account, files and databases. However, if you are going to manage your site or complete server, you're a manager. With this I mean you will be managing accounts, web sites, databases, scripts, etc. You will be using SSH/SFTP and perhaps a control panel such as WHM / webmin, etc.

This means that if you have a user with a MySQL database and you need to rename it, back it up, restore it, and so on, you're basically managing your site, and I recommend using the CLI over the browser. You will have fewer issues; some issues exist simply because your web site is too big. Your backup might time out, and you will end up with incomplete, missing data. Why? Because the web server, PHP, and MySQL all have max file size, max timeout, and max connection settings. If you try to back up a 2 GB .sql dump through phpMyAdmin, for example, you might find that it stops after 200 MB and the last line of the .sql dump reads "Error, server went away" or "Error, connection timed out".

How much would it suck to find out a year later that your only backup is missing over 1.5 GB and you can't restore your site?

You don't really have these issues when you're doing it from the CLI. So if you have a site and a user for it, and you need to manage the database (not develop it), just remember: "I am going to manage it, so I need to act like a super user and use the CLI." Skip the whole GUI convenience and learn in a few minutes what mysqldump can do for you, with a guarantee that the .sql dump has completed. Well, a much bigger guarantee than, say, phpMyAdmin can give you.

There are more issues, such as speed. In a browser you're limited by your connection, while mysqldump writes directly to the disk on the server; there's no need to re-upload the dump once it's done, for example. Another issue with a browser is that you might accidentally close the tab, or the browser might crash. With the CLI you can even start a screen session (which I recommend for a big database backup), run mysqldump, and fork it into the background. Even if you lose your connection, you can come back, restore the screen session, and find that your backup has completed.
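On the server that could look something like this, assuming screen is installed (Ctrl-a d is screen's detach keystroke):

```
$ screen -S backup                 (start a named screen session)
$ mysqldump --opt -Q -u xenforo_bob -p xenforo > xenforo_backup.sql
(press Ctrl-a d to detach; the dump keeps running on the server)
$ screen -r backup                 (later, even from a new SSH login: reattach)
```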

Hand Holding

Let's walk through the process with an example.

You have a XenForo forum under the username forums, and a MySQL database called xenforo with user xenforo_bob, which has the password secret. The forum is installed, you have it configured, and users have registered and are starting to post. Time to make a backup.

You also have created a user called backups.

What I would do.
(my iMac Terminal)
$ ssh -l backups vps.example.com
pass: ********

(now my remote VPS)
$ pwd
/home/backups/
$ cd sql
$ ls *.sql
xenforo_january.sql  xenforo_february.sql

Clearly it's been too long, let's make a backup.
$ mysqldump --opt -Q -u xenforo_bob -p xenforo > xenforo_june.sql
Enter password: ******
$
(yay, no errors, it's done!)
$ tail xenforo_june.sql
*/
-- Dump completed on 2012-06-12 17:27:00
$
(seems that went well, we don't need january at this point)
$ rm xenforo_january.sql
$ exit

And that's the most basic form of quickly backing up your forum's database.

Your next step is to make an off-site backup, in case your VPS dies a horrible death.
SFTP into your VPS as user backups, browse to the sql/ directory where you store your .sql dumps, and download them.
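From the command line that could look like this, run on your own computer (not on the VPS); ~/Backups/ is a hypothetical local directory:

```
$ scp backups@vps.example.com:sql/xenforo_june.sql ~/Backups/
```

Or, to mirror the whole sql/ directory and only transfer dumps that are new or changed:

```
$ rsync -avz backups@vps.example.com:sql/ ~/Backups/sql/
```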
