Laravel Database Backups

By Bryan Nielsen on May 8th, 2017 in Laravel

As you get close to finishing your Laravel application it's extremely important to think about your backup policy. Whether you encounter a malicious attack or an application bug, you need the ability to restore your data. Thankfully, it is a quick and painless process to configure backups with Laravel.

Before you get started you should decide how often to back up the database and where to store the files.

Implementing the Policy

Once you've made these decisions you can start implementing your backup policy. We like to use the Backup Manager package along with Amazon S3 for file storage. This guide is focused on how to use these tools with Laravel 5 but applying the principles to Lumen or Laravel 4 shouldn't be a stretch. We're going to gloss over the details of setting up an Amazon Web Services account since that is extremely well documented.

Setting up Amazon S3

  1. Create a new bucket. Nothing too complicated; all we need is a name and a region.

  2. Add a Lifecycle Rule that will automatically transition backups into an infrequent access storage class. Review the Amazon S3 lifecycle settings for your Laravel backups.

  3. Set up a new IAM user with write-only access to this bucket.

    1. Sign into the IAM console https://console.aws.amazon.com/iam/
    2. Navigate to Users > Add User.
    3. Give the user a name and "Programmatic Access" only.
    4. Go to Permissions and choose "Attach existing policies directly".
    5. Then "Create Policy > Create Your Own Policy". Give the policy a name for reference (e.g., S3BackupCreator).
    6. Paste in the policy below and substitute your bucket's name for YOUR_BUCKET_NAME in the resource identifiers.

    { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": "s3:PutObject", "Resource": [ "arn:aws:s3:::YOUR_BUCKET_NAME", "arn:aws:s3:::YOUR_BUCKET_NAME/*" ] } ] }

    Note: This is a restrictive policy that only allows the user to create objects in this specific bucket. We do this to make sure that attackers can't delete existing backups or view the historical data they contain. Just be aware that the tradeoff for this strict policy is that it breaks the db:restore functionality provided by Backup Manager; in the event of a problem you will need to restore your database manually (a rough manual-restore sketch follows this list).

  4. Copy the user credentials into your environment file using the keys BACKUPS_AWS_KEY and BACKUPS_AWS_SECRET.
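
    For reference, the corresponding entries in your environment file will look something like this (the region and bucket keys are read by the configuration shown in the next section; all values here are placeholders):

    BACKUPS_AWS_KEY=your-iam-access-key-id
    BACKUPS_AWS_SECRET=your-iam-secret-access-key
    BACKUPS_AWS_REGION=us-east-1
    BACKUPS_AWS_BUCKET=YOUR_BUCKET_NAME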

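As noted above, the write-only policy breaks Backup Manager's db:restore command, so a restore has to be done by hand. A rough sketch of that process, assuming the AWS CLI is installed and using placeholder bucket, file, and database names (run it with credentials that can read the bucket, not the write-only backup user):

    aws s3 cp s3://YOUR_BUCKET_NAME/path/to/backup.sql.gz ./backup.sql.gz
    gunzip backup.sql.gz
    mysql -u your_user -p your_database < backup.sql
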
Configuring Laravel Backup Manager

  1. Follow the package's installation steps for your version of Laravel/Lumen. You will need to install both the package and the S3 driver for this setup:

    composer require backup-manager/laravel
    composer require league/flysystem-aws-s3-v3
    
  2. Update your config/backup-manager.php file to pull S3 credentials from the environment file:

    ...
        's3' => [
            'type'   => 'AwsS3',
            'key'    => env('BACKUPS_AWS_KEY'),
            'secret' => env('BACKUPS_AWS_SECRET'),
            'region' => env('BACKUPS_AWS_REGION', 'us-east-1'),
            'bucket' => env('BACKUPS_AWS_BUCKET', 'YOUR_BUCKET_NAME'),
            'root'   => '',
        ],
    ...
    
  3. Schedule the backups by adding this code to app/Console/Kernel.php:

    // Add `use Carbon\Carbon;` to the imports at the top of app/Console/Kernel.php.

    /**
     * Define the application's command schedule.
     *
     * @param  \Illuminate\Console\Scheduling\Schedule  $schedule
     * @return void
     */
    protected function schedule(Schedule $schedule)
    {
        // Build a unique, sortable filename: project_environment_timestamp.sql
        $date = Carbon::now()->toW3cString();
        $env = config('app.env');
        $project = snake_case(config('app.name'));

        // Keep the command string on a single line; embedded newlines can
        // break the shell command the scheduler runs.
        $schedule->command(
            "db:backup --database=mysql --destination=s3 --compression=gzip"
            . " --destinationPath=/{$project}_{$env}_{$date}.sql"
        )->hourly();
    }
    
  4. Finally, make sure you have Laravel Task Scheduling set up correctly on your server. If you're using Laravel Forge, there's a simple form pre-filled with the necessary information.

Task Scheduling with Laravel Forge
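
If you're managing the server yourself, the scheduler is driven by a single cron entry that runs every minute, as described in the Laravel documentation (adjust the path to your project):

    * * * * * php /path/to/your/project/artisan schedule:run >> /dev/null 2>&1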

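Before relying on the schedule, it's worth running the backup command once by hand and confirming that a file shows up in your bucket. The destination path below is just an example:

    php artisan db:backup --database=mysql --destination=s3 --compression=gzip --destinationPath=/test_backup.sql
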
Going Further

If having gaps in your data is unacceptable, you should consider setting up incremental backups with MySQL. You can also configure database replication with MySQL and run your backups on another server to mitigate any performance hit. These are great solutions and highly recommended for large applications.
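
As a rough sketch of the incremental approach: MySQL's binary log records every change, so a full dump plus the binlogs written since then gives you point-in-time recovery. The paths and file names below are placeholders:

    # In my.cnf, under [mysqld] (restart MySQL afterwards):
    #   log_bin          = /var/log/mysql/mysql-bin.log
    #   expire_logs_days = 7
    # During a restore, replay the changes recorded after your last full dump:
    mysqlbinlog /var/log/mysql/mysql-bin.000042 | mysql -u your_user -p your_database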
