Many years ago I worked for an ISP and we had way more server failures than I'd ever thought would be possible. At the time we had a tape backup that wasn't terribly reliable and so every server crash resulted in a decent amount of lost data. Ever since then I've been a bit compulsive about making sure all of my important data is backed up, preferably off-site.
Although I have yet to have any problems with my cloud servers, I still like to have everything backed up. My code is all on GitHub or Bitbucket, so the only thing that couldn't easily be replicated was my databases. After trying a few options, I decided to back the databases up to an Amazon S3 bucket: a shell script dumps the databases to files, and a Laravel command uploads the dumps to S3. That has been running with no issues for a couple of months now.
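To give a sense of the shape of that upload command, here's a minimal sketch using Laravel's Storage facade with a configured `s3` filesystem disk. The class name, command signature, and bucket path are hypothetical stand-ins, not the actual code from my package:

```php
<?php

namespace App\Console\Commands;

use Illuminate\Console\Command;
use Illuminate\Support\Facades\Storage;

class UploadDumpToS3 extends Command
{
    // Hypothetical signature; the real command and paths will differ.
    protected $signature = 'backup:upload {file : Path to the database dump}';
    protected $description = 'Upload a database dump to an S3 bucket';

    public function handle()
    {
        $file = $this->argument('file');

        // Stream the dump to the "s3" disk so large files
        // don't have to be read into memory all at once.
        Storage::disk('s3')->put(
            'db-backups/' . basename($file),
            fopen($file, 'r')
        );

        $this->info('Uploaded ' . $file . ' to S3.');
    }
}
```

The nice part of going through the Storage facade is that the command doesn't care where the backups land; swapping S3 for another disk is a config change, not a code change.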
Then I put the Laravel command into a package and added a few other commands - one to back up entire directories and one to automatically dump the database, upload it, and then delete the dump - eliminating the need to do some things in a shell script and others in Laravel. That package is on my GitHub page and available on Packagist. I will probably add more to it, but as of now it is working, and it can be scheduled either as a cron job or via Laravel's scheduler, as sketched below.
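Here's what scheduling it through Laravel's scheduler might look like. The command names below are hypothetical stand-ins for whatever the package actually registers:

```php
<?php

namespace App\Console;

use Illuminate\Console\Scheduling\Schedule;
use Illuminate\Foundation\Console\Kernel as ConsoleKernel;

class Kernel extends ConsoleKernel
{
    protected function schedule(Schedule $schedule)
    {
        // Dump the database, upload it, and clean up, every night.
        $schedule->command('backup:database')->dailyAt('02:00');

        // Back up a whole directory once a week.
        $schedule->command('backup:directory /var/www/uploads')->weekly();
    }
}
```

The scheduler itself is driven by the single standard cron entry `* * * * * php /path/to/artisan schedule:run >> /dev/null 2>&1`; alternatively, you can skip the scheduler entirely and call the artisan command straight from cron, e.g. `0 2 * * * php /path/to/artisan backup:database`.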
Detailed information on the package is available on its GitHub page.