Configuring Automated Service Backups

This topic describes how to configure automated backups in Redis for Pivotal Cloud Foundry (PCF).

Comparison of the Available Backup Methods

Redis for PCF provides two backup methods, which can be used together or alone:

  • BOSH Backup and Restore (BBR) - preferred
  • Automated service backups

If you have already set up BBR for your Pivotal Application Service (PAS) deployment, you might find it easier to use BBR to back up your on-demand Redis service instances, in addition to, or instead of, automated service backups.

The list below summarizes the differences between the two methods:

  • BBR (supports on-demand services only) backs up:
    • Data stored in Redis
    • The manifest used to deploy the service instance
    • Certain additional configuration, including plan settings such as Redis Client Timeout and arbitrary parameters such as maxmemory-policy
  • Automated service backups (support both on-demand and shared-VM services) back up:
    • Data stored in Redis

Note: Neither backup method backs up other manual changes made to service instances, whether made over SSH or with the Redis client CONFIG command.

For more information, see BOSH Backup and Restore (BBR) for On-Demand Redis for PCF.

About Automated Service Backups

You can configure automatic backups for both on-demand and shared-VM plan types.

Automated backups have the following features:

  • Backups run on a configurable schedule.
  • Every instance is backed up.
  • The Redis broker state file is backed up.
  • Before a backup starts, data from Redis is flushed to disk by running a BGSAVE on each instance.
  • You can configure Amazon Web Services (AWS) S3, SCP, Azure, or Google Cloud Storage (GCS) as your destination.
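The flush-to-disk step can be sketched as follows. This is a rough illustration of the mechanism, not the product's actual implementation, and it assumes a running local Redis reachable by redis-cli:

```shell
# Record the timestamp of the last completed save.
LAST=$(redis-cli LASTSAVE)

# Trigger a background save, as the automated backup does.
redis-cli BGSAVE

# Poll until LASTSAVE advances, meaning the RDB file is on disk.
until [ "$(redis-cli LASTSAVE)" != "$LAST" ]; do
  sleep 1   # in the product, this wait is bounded by the Backup timeout field
done
echo "RDB snapshot is on disk"
```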

Backup Files

When Redis for PCF runs an automated backup, it labels the backups in the following ways:

  • For shared-VM plans, backups are labeled with timestamp, instance GUID, and plan name. Files are stored by date.
  • For on-demand plans, backups are labeled with timestamp and plan name. Files are stored by deployment, then date.

For each backup artifact, Redis for PCF creates a file that contains the MD5 checksum for that artifact. This can be used to validate that the artifact is not corrupted.
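For example, on a Linux host you can check an artifact against its checksum file with md5sum. A minimal sketch, with illustrative file names; a small example artifact is fabricated here so the commands run end to end:

```shell
# Fabricate an example backup artifact and its checksum file
# (in practice, both are downloaded from your backup destination).
echo "example redis dump" > backup.rdb
md5sum backup.rdb > backup.rdb.md5

# Validate the artifact; md5sum exits non-zero if the checksum differs.
md5sum -c backup.rdb.md5
```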

About Configuring Backups

Redis for PCF automatically backs up databases to external storage.

  • How and where: There are four options for how automated backups transfer backup data and where they store it:

    • Option 1: Back Up with AWS: Redis for PCF runs an AWS S3 client that saves backups to an S3 bucket.
    • Option 2: Back Up with SCP: Redis for PCF runs an SCP command that secure-copies backups to a VM or physical machine operating outside of PCF. SCP stands for secure copy protocol, and offers a way to securely transfer files between two hosts. The operator provisions the backup machine separately from their PCF installation. This is the fastest option.
    • Option 3: Back Up to GCS: Redis for PCF runs a GCS SDK client that saves backups to a Google Cloud Storage (GCS) bucket.
    • Option 4: Back Up to Azure: Redis for PCF runs an Azure SDK that saves backups to an Azure storage account.
  • When: Backups follow a schedule that you specify with a cron expression.

    For general information about cron, see the cron package documentation.
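A cron expression has five fields: minute, hour, day of month, month, and day of week. Some illustrative schedules:

```
# minute  hour  day-of-month  month  day-of-week
0 2 * * *       # every day at 2:00 AM
0 */6 * * *     # every six hours, on the hour
30 1 * * 0      # Sundays at 1:30 AM
@daily          # predefined schedule: once a day at midnight
@every 1h30m    # every 90 minutes
```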

To configure automated backups, follow the procedures below according to the option you choose for external storage.

Option 1: Back Up with AWS

To back up your database to an Amazon S3 bucket, complete the following procedures:

Create a Policy and Access Key

Redis for PCF accesses your S3 store through a user account. Pivotal recommends that this account be solely for Redis for PCF. You must apply a minimal policy that lets the user account upload backups to your S3 store.

Do the following to create a policy and access key:

  1. Navigate to the AWS Console and log in.
  2. To create a new custom policy, go to IAM > Policies > Create Policy > Create Your Own Policy and paste in the following permissions:

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": [
                    "s3:ListBucket",
                    "s3:ListBucketMultipartUploads",
                    "s3:ListMultipartUploadParts",
                    "s3:PutObject"
                ],
                "Resource": [
                    "arn:aws:s3:::MY-BUCKET-NAME",
                    "arn:aws:s3:::MY-BUCKET-NAME/*"
                ]
            }
        ]
    }
    

    Where MY-BUCKET-NAME is the name of your S3 bucket.

    If the S3 bucket does not already exist, add s3:CreateBucket to the Action list to create it.

  3. (Recommended) Create a new user for Redis for PCF and record its Access Key ID and Secret Access Key. These user credentials are required when you configure backups in Ops Manager.

  4. (Recommended) Attach the policy you created to the AWS user account that Redis for PCF will use to access S3. Go to IAM > Policies > Policy Actions > Attach.
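If you prefer the command line, steps 2 through 4 can also be done with the AWS CLI. This is a sketch under assumptions: the policy JSON above is saved as policy.json, and redis-backup-user, redis-backup-policy, and ACCOUNT-ID are illustrative placeholders:

```shell
# Create a dedicated user for Redis for PCF backups (name is illustrative).
aws iam create-user --user-name redis-backup-user

# Create the minimal policy from the JSON document shown above.
aws iam create-policy \
  --policy-name redis-backup-policy \
  --policy-document file://policy.json

# Attach the policy to the user; substitute your account ID in the ARN
# (create-policy prints the full ARN).
aws iam attach-user-policy \
  --user-name redis-backup-user \
  --policy-arn arn:aws:iam::ACCOUNT-ID:policy/redis-backup-policy

# Generate the Access Key ID and Secret Access Key for Ops Manager.
aws iam create-access-key --user-name redis-backup-user
```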

Configure Backups in Ops Manager

Do the following to connect Redis for PCF to your S3 account:

  1. Navigate to the Ops Manager Installation Dashboard and click the Redis for PCF tile.
  2. Click Backups.
  3. Under Backup configuration, select AWS S3.
  4. Fill in the fields as follows:

    • Access Key ID (Mandatory) - The access key for your S3 account.
    • Secret Access Key (Mandatory) - The secret key associated with your access key.
    • Endpoint URL (Optional) - The endpoint of your S3 account, such as http://s3.amazonaws.com. Defaults to http://s3.amazonaws.com if not specified.
    • Bucket Name (Mandatory) - Name of the bucket in which to store the backup.
    • Bucket Path (Mandatory) - Path inside the bucket to save backups to.
    • Cron Schedule (Mandatory) - Backup schedule in crontab format. For example, once daily at 2 AM is 0 2 * * *. This field also accepts a predefined schedule, such as @yearly, @monthly, @weekly, @daily, @hourly, or @every TIME, where TIME is any supported time string, such as 1h30m. For more information, see the cron package documentation.
    • Backup timeout (Mandatory) - The amount of time, in seconds, that the backup process waits for the BGSAVE command to complete on your instance before transferring the RDB file to your configured destination. If the timeout is reached, BGSAVE continues but the backup fails and is not uploaded.
  5. Click Save.

Option 2: Back Up with SCP

To back up your database using SCP, complete the following procedures:

(Recommended) Create a Public and Private Key Pair

Redis for PCF accesses a remote host as a user with a private key for authentication. Pivotal recommends that this user and key pair be solely for Redis for PCF.

Do the following to create a new public and private key pair for authenticating:

  1. Determine the remote host that you will be using to store backups for Redis for PCF. Ensure that the Redis service instances can access the remote host.

    Note: Pivotal recommends using a VM outside the PCF deployment for the destination of SCP backups. As a result you might need to enable public IPs for the Redis VMs.

  2. Create a new user for Redis for PCF on the destination VM.
  3. Create a new public and private key pair for authenticating as the above user on the destination VM.
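Step 3 can be done with ssh-keygen. A minimal sketch, with an illustrative key file name and destination user:

```shell
# Generate a dedicated RSA key pair for the backup user, with no
# passphrase, since the private key is pasted into Ops Manager.
ssh-keygen -t rsa -b 4096 -f redis-backup-key -N "" -C "redis-for-pcf-backups"

# On the destination VM, append the public key to the backup user's
# authorized_keys file, for example:
#   cat redis-backup-key.pub >> /home/redis-backup/.ssh/authorized_keys
ls redis-backup-key redis-backup-key.pub
```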

Configure Backups in Ops Manager

Do the following to connect Redis for PCF to your destination VM:

  1. Navigate to the Ops Manager Installation Dashboard and click the Redis for PCF tile.
  2. Click Backups.
  3. Under Backup configuration, select SCP.
  4. Fill in the fields as follows:

    • Username (Mandatory) - The username to use for transferring backups to the SCP server.
    • Private Key (Mandatory) - The private SSH key of the user configured in Username.
    • Hostname (Mandatory) - The hostname or IP address of the SCP server.
    • Destination Directory (Mandatory) - The path on the SCP server to which backups are transferred.
    • SCP Port (Mandatory) - The SCP port of the SCP server, typically 22.
    • Cron Schedule (Mandatory) - Backup schedule in crontab format. For example, once daily at 2 AM is 0 2 * * *. This field also accepts a predefined schedule, such as @yearly, @monthly, @weekly, @daily, @hourly, or @every TIME, where TIME is any supported time string, such as 1h30m. For more information, see the cron package documentation.
    • Backup timeout (Mandatory) - The amount of time, in seconds, that the backup process waits for the BGSAVE command to complete on your instance before transferring the RDB file to the SCP server. If the timeout is reached, BGSAVE continues but the backup fails and is not uploaded.
    • Fingerprint (Optional) - The MD5 fingerprint of the public host key of the SCP server. To retrieve it, run ssh-keygen -E md5 -lf against the server's public host key file, for example /etc/ssh/ssh_host_rsa_key.pub on the server.
  5. Click Save.
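One way to obtain the server fingerprint for the Fingerprint field is shown below. The host name is illustrative, and a throwaway key is generated here so the fingerprinting command runs end to end:

```shell
# From a client machine, fetch and fingerprint the server's host key
# (substitute your SCP server's hostname):
#   ssh-keyscan scp.example.com 2>/dev/null | ssh-keygen -E md5 -lf -

# The same ssh-keygen invocation works on any public key file; a
# throwaway key demonstrates the MD5 fingerprint output format.
ssh-keygen -t rsa -f demo-host-key -N "" -q
ssh-keygen -E md5 -lf demo-host-key.pub
```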

Option 3: Back Up with GCS

To back up your database using GCS, complete the following procedures:

Create a Service Account

Redis for PCF accesses your GCS store through a service account. Pivotal recommends that this account be solely for Redis for PCF. You must apply a minimal policy that lets the service account upload backups to your GCS store.

Do the following to create a service account with the correct permissions:

  1. In the GCS console, create a new service account for Redis for PCF: IAM and Admin > Service Accounts > Create Service Account.
  2. Enter a unique name in the Service account name field, such as Redis-for-PCF.
  3. In the Roles dropdown, grant the new service account the Storage Admin role.
  4. Select the Furnish a new private key checkbox so that a new key is created and downloaded.
  5. Click Create and take note of the name and location of the service account JSON file that is downloaded.
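The same service account setup can be sketched with the gcloud CLI. The account name, project ID, and key file name below are illustrative placeholders, and an authenticated gcloud session is assumed:

```shell
# Create a dedicated service account for Redis for PCF backups.
gcloud iam service-accounts create redis-for-pcf \
  --display-name "Redis for PCF backups"

# Grant it the Storage Admin role on the project.
gcloud projects add-iam-policy-binding MY-PROJECT-ID \
  --member "serviceAccount:redis-for-pcf@MY-PROJECT-ID.iam.gserviceaccount.com" \
  --role "roles/storage.admin"

# Download the JSON key to paste into Ops Manager.
gcloud iam service-accounts keys create key.json \
  --iam-account "redis-for-pcf@MY-PROJECT-ID.iam.gserviceaccount.com"
```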

Configure Backups in Ops Manager

Do the following to connect Redis for PCF to GCS:

  1. Navigate to the Ops Manager Installation Dashboard and click the Redis for PCF tile.
  2. Click Backups.
  3. Under Backup configuration, select GCS.
  4. Fill in the fields as follows:

    • Project ID (Mandatory) - The Google Cloud Platform (GCP) project ID.
    • Bucket name (Mandatory) - Name of the bucket in which to store the backup.
    • Service account private key (Mandatory) - The JSON secret key associated with your service account.
    • Cron Schedule (Mandatory) - Backup schedule in crontab format. For example, once daily at 2 AM is 0 2 * * *. This field also accepts a predefined schedule, such as @yearly, @monthly, @weekly, @daily, @hourly, or @every TIME, where TIME is any supported time string, such as 1h30m. For more information, see the cron package documentation.
    • Backup timeout (Mandatory) - The amount of time, in seconds, that the backup process waits for the BGSAVE command to complete on your instance before transferring the RDB file to your configured destination. If the timeout is reached, BGSAVE continues but the backup fails and is not uploaded.
  5. Click Save.

Option 4: Back Up to Azure

Do the following to back up your database to an Azure storage account:

  1. Navigate to the Ops Manager Installation Dashboard and click the Redis for PCF tile.
  2. Click Backups.
  3. Under Backup configuration, select Azure.
  4. Fill in the fields as follows:

    • Account (Mandatory) - The Azure storage account name.
    • Azure Storage Access Key (Mandatory) - The Azure credential required to write to the Azure container.
    • Container Name (Mandatory) - Name of the Azure container in which to store the backup.
    • Destination Directory (Mandatory) - Directory within the Azure container in which to store the backup files.
    • Blob Store Base URL (Optional) - URL pointing to the Azure resource.
    • Cron Schedule (Mandatory) - Backup schedule in crontab format. For example, once daily at 2 AM is 0 2 * * *. This field also accepts a predefined schedule, such as @yearly, @monthly, @weekly, @daily, @hourly, or @every TIME, where TIME is any supported time string, such as 1h30m. For more information, see the cron package documentation.
    • Backup timeout (Mandatory) - The amount of time, in seconds, that the backup process waits for the BGSAVE command to complete on your instance before transferring the RDB file to your configured destination. If the timeout is reached, BGSAVE continues but the backup fails and is not uploaded.
  5. Click Save.
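The storage access key for the fields above can be looked up with the Azure CLI. The account and resource group names are illustrative placeholders, and an authenticated az session is assumed:

```shell
# List the access keys for the storage account; either key works
# for the Azure Storage Access Key field.
az storage account keys list \
  --account-name myredisbackups \
  --resource-group my-resource-group \
  --output table
```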

Back Up and Restore Manually

To back up or restore Redis manually, see Manually Backing Up and Restoring Redis for Pivotal Cloud Foundry in the Pivotal Support knowledge base.
