Configuring Automated Service Backups
Warning: Redis v2.2 is no longer supported because it has reached the End of General Support (EOGS) phase as defined by the Support Lifecycle Policy. To stay up to date with the latest software and security updates, upgrade to a supported version.
This topic describes how to configure automated backups in Redis for Pivotal Cloud Foundry (PCF).
Comparison of the Available Backup Methods
Redis for PCF provides two backup methods, which can be used together or alone:
- BOSH Backup and Restore (BBR) - preferred
- Automated service backups
If you have already set up BBR for your Pivotal Application Service (PAS) deployment, you might find it easier to use BBR to back up your on-demand Redis service instances, in addition to, or instead of, using automated service backups.
The table below summarizes the differences between the two methods:
Backup Method | Supported Services | What is Backed Up |
---|---|---|
BBR | On-demand | Data stored in Redis |
Automated service backups | On-demand and shared-VM | Data stored in Redis |
Note: Neither backup method backs up other manual changes made to service instances, either via SSH or with the Redis client `CONFIG` command.
For more information, see BOSH Backup and Restore (BBR) for On-Demand Redis for PCF.
About Automated Service Backups
You can configure automatic backups for both on-demand and shared-VM plan types.
Automated backups have the following features:
- Backups run on a configurable schedule.
- Every instance is backed up.
- The Redis broker state file is backed up.
- Data from Redis is flushed to disk before the backup is started by running a `BGSAVE` on each instance.
- You can configure Amazon Web Services (AWS) S3, SCP, Azure, or Google Cloud Storage (GCS) as your destination.
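Conceptually, the flush-and-wait step works as in the following sketch. The helper and the callables it takes are hypothetical names, shown as plain functions so the example does not need a live Redis server; with a client such as redis-py you would pass the client's `lastsave` and `bgsave` methods:

```python
# Sketch of the "flush to disk before backup" step: trigger a background
# save, then poll until the last-save timestamp advances or a timeout
# expires. wait_for_bgsave and its arguments are hypothetical names; the
# product performs this step for you during an automated backup.
import time

def wait_for_bgsave(lastsave, bgsave, timeout_s=60.0, poll_s=0.5,
                    clock=time.monotonic, sleep=time.sleep):
    before = lastsave()          # time of the last successful save
    bgsave()                     # start the background RDB dump
    deadline = clock() + timeout_s
    while clock() < deadline:
        if lastsave() > before:  # timestamp advanced: the dump finished
            return True
        sleep(poll_s)
    return False                 # timed out; BGSAVE continues server-side
```

A `False` return mirrors the documented timeout behavior: the save continues on the server, but the backup fails and is not uploaded.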
Backup Files
When Redis for PCF runs an automated backup, it labels the backups in the following ways:
- For shared-VM plans, backups are labeled with timestamp, instance GUID, and plan name. Files are stored by date.
- For on-demand plans, backups are labeled with timestamp and plan name. Files are stored by deployment, then date.
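The labeling rules above can be illustrated with a small sketch. The exact file and directory names that Redis for PCF produces are not specified here, so these path-building functions are purely hypothetical and only show how the label components combine:

```python
# Illustration of the backup labeling rules. The real filenames produced
# by Redis for PCF may differ; these helpers are hypothetical.
from datetime import datetime

def shared_vm_key(bucket_path, ts, instance_guid, plan_name):
    # Shared-VM: stored by date; labeled with timestamp, GUID, plan name.
    return f"{bucket_path}/{ts:%Y/%m/%d}/{ts:%Y%m%d%H%M%S}_{instance_guid}_{plan_name}"

def on_demand_key(bucket_path, ts, deployment, plan_name):
    # On-demand: stored by deployment, then date; labeled with timestamp
    # and plan name.
    return f"{bucket_path}/{deployment}/{ts:%Y/%m/%d}/{ts:%Y%m%d%H%M%S}_{plan_name}"
```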
For each backup artifact, Redis for PCF creates a file that contains the MD5 checksum for that artifact. This can be used to validate that the artifact is not corrupted.
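For example, a checksum file can be verified against its artifact with a few lines of stdlib Python (the filenames and helper names here are illustrative):

```python
# Validate a backup artifact against its companion MD5 checksum file,
# as described above. Helper names are illustrative.
import hashlib

def md5_of(path, chunk_size=1 << 20):
    # Stream the file in chunks so large RDB files fit in memory.
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def validate(artifact_path, checksum_path):
    # The checksum file's first token is the expected hex digest.
    with open(checksum_path) as f:
        expected = f.read().split()[0]
    return md5_of(artifact_path) == expected
```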
About Configuring Backups
Redis for PCF automatically backs up databases to external storage.
How and where: There are four options for how automated backups transfer backup data and where the data is saved:
- Option 1: Back Up with AWS: Redis for PCF runs an AWS S3 client that saves backups to an S3 bucket.
- Option 2: Back Up with SCP: Redis for PCF runs an SCP command that secure-copies backups to a VM or physical machine operating outside of PCF. SCP stands for secure copy protocol, and offers a way to securely transfer files between two hosts. The operator provisions the backup machine separately from their PCF installation. This is the fastest option.
- Option 3: Back Up to GCS: Redis for PCF runs a GCS SDK that saves backups to a Google Cloud Storage bucket.
- Option 4: Back Up to Azure: Redis for PCF runs an Azure SDK that saves backups to an Azure storage account.
When: Backups follow a schedule that you specify with a cron expression.
For general information about cron, see package cron.
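As a rough illustration of the two accepted shapes (a standard five-field crontab expression, or a `@`-descriptor), here is a minimal shape check. This is a sketch only: the real parsing is done by the cron package, and this code does not validate field values:

```python
# Minimal shape check for the two cron formats the Backups tab accepts.
# A sketch only; it checks structure, not whether field values are valid.
import re

DESCRIPTORS = {"@yearly", "@annually", "@monthly", "@weekly", "@daily", "@hourly"}
EVERY = re.compile(r"^@every\s+(\d+(ns|us|ms|s|m|h))+$")

def looks_like_cron(expr):
    expr = expr.strip()
    if expr in DESCRIPTORS:
        return True
    if EVERY.match(expr):          # e.g. "@every 1h30m"
        return True
    # Standard crontab: five whitespace-separated fields
    # (minute, hour, day-of-month, month, day-of-week).
    return len(expr.split()) == 5
```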
To configure automated backups, follow the procedures below according to the option you choose for external storage.
Option 1: Back Up with AWS
To back up your database to an Amazon S3 bucket, complete the following procedures:
Create a Policy and Access Key
Redis for PCF accesses your S3 store through a user account. Pivotal recommends that this account be solely for Redis for PCF. You must apply a minimal policy that lets the user account upload backups to your S3 store.
Do the following to create a policy and access key:
- Navigate to the AWS Console and log in.
- To create a new custom policy, go to IAM > Policies > Create Policy > Create Your Own Policy and paste in the following permissions:

```
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket",
        "s3:ListBucketMultipartUploads",
        "s3:ListMultipartUploadParts",
        "s3:PutObject"
      ],
      "Resource": [
        "arn:aws:s3:::MY-BUCKET-NAME",
        "arn:aws:s3:::MY-BUCKET-NAME/*"
      ]
    }
  ]
}
```
Where `MY-BUCKET-NAME` is the name of your S3 bucket. If the S3 bucket does not already exist, add `s3:CreateBucket` to the `Action` list to create it.
- (Recommended) Create a new user for Redis for PCF and record its Access Key ID and Secret Access Key, which are the user credentials.
- (Recommended) Attach the policy you created to the AWS user account that Redis for PCF will use to access S3: go to IAM > Policies > Policy Actions > Attach.
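Before attaching the policy, it can be useful to sanity-check that the document grants every permission the backup needs. A stdlib-only sketch (the helper names are illustrative, not part of any AWS tooling):

```python
# Sanity-check an IAM policy document: parse the JSON and confirm that
# every S3 action the backup requires is granted by an Allow statement.
# Helper names are illustrative.
import json

REQUIRED = {"s3:ListBucket", "s3:ListBucketMultipartUploads",
            "s3:ListMultipartUploadParts", "s3:PutObject"}

def granted_actions(policy_json):
    policy = json.loads(policy_json)
    actions = set()
    for stmt in policy["Statement"]:
        if stmt.get("Effect") == "Allow":
            acts = stmt["Action"]
            # "Action" may be a single string or a list of strings.
            actions.update([acts] if isinstance(acts, str) else acts)
    return actions

def policy_covers_backups(policy_json):
    return REQUIRED <= granted_actions(policy_json)
```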
Configure Backups in Ops Manager
Do the following to connect Redis for PCF to your S3 account:
- Navigate to the Ops Manager Installation Dashboard and click the Redis for PCF tile.
- Click Backups.
- Under Backup configuration, select AWS S3.
- Fill in the fields as follows:

| Field | Description | Mandatory/Optional |
|---|---|---|
| Access Key ID | The access key for your S3 account | Mandatory |
| Secret Access Key | The secret key associated with your access key | Mandatory |
| Endpoint URL | The endpoint of your S3 account, such as `http://s3.amazonaws.com` | Optional, defaults to `http://s3.amazonaws.com` if not specified |
| Bucket Name | Name of the bucket in which to store the backups | Mandatory |
| Bucket Path | Path inside the bucket to save backups to | Mandatory |
| Cron Schedule | Backup schedule in crontab format. For example, once daily at 2 AM is `0 2 * * *`. This field also accepts a pre-defined schedule, such as `@yearly`, `@monthly`, `@weekly`, `@daily`, `@hourly`, or `@every TIME`, where `TIME` is any supported time string, such as `1h30m`. For more information, see the cron package documentation. | Mandatory |
| Backup timeout | The amount of time, in minutes, that the backup process waits for the `BGSAVE` command to complete on your instance before transferring the RDB file to your configured destination. If the timeout is reached, `BGSAVE` continues but backups fail and are not uploaded. | Mandatory |

- Click Save.
Option 2: Back Up with SCP
To back up your database using SCP, complete the following procedures:
(Recommended) Create a Public and Private Key Pair
Redis for PCF accesses a remote host as a user with a private key for authentication. Pivotal recommends that this user and key pair be solely for Redis for PCF.
Do the following to create a new public and private key pair for authenticating:
- Determine the remote host that you will be using to store backups for Redis for PCF. Ensure that the Redis service instances can access the remote host.
  Note: Pivotal recommends using a VM outside the PCF deployment as the destination for SCP backups. As a result, you might need to enable public IPs for the Redis VMs.
- Create a new user for Redis for PCF on the destination VM.
- Create a new public and private key pair for authenticating as the above user on the destination VM.
Configure Backups in Ops Manager
Do the following to connect Redis for PCF to your destination VM:
- Navigate to the Ops Manager Installation Dashboard and click the Redis for PCF tile.
- Click Backups.
- Under Backup configuration, select SCP.
- Fill in the fields as follows:

| Field | Description | Mandatory/Optional |
|---|---|---|
| Username | The username to use for transferring backups to the SCP server | Mandatory |
| Private Key | The private SSH key of the user configured in Username | Mandatory |
| Hostname | The hostname or IP address of the SCP server | Mandatory |
| Destination Directory | The path on the SCP server to which the backups are transferred | Mandatory |
| SCP Port | The SCP port of the SCP server | Mandatory |
| Cron Schedule | Backup schedule in crontab format. For example, once daily at 2 AM is `0 2 * * *`. This field also accepts a pre-defined schedule, such as `@yearly`, `@monthly`, `@weekly`, `@daily`, `@hourly`, or `@every TIME`, where `TIME` is any supported time string, such as `1h30m`. For more information, see the cron package documentation. | Mandatory |
| Backup timeout | The amount of time, in minutes, that the backup process waits for the `BGSAVE` command to complete on your instance before transferring the RDB file to the SCP server. If the timeout is reached, `BGSAVE` continues but backups fail and are not uploaded. | Mandatory |
| Fingerprint | The fingerprint of the public key of the SCP server. To retrieve the server's fingerprint, run `ssh-keygen -E md5 -lf ~/.ssh/id_rsa.pub`. | Optional |

- Click Save.
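The `ssh-keygen` command above prints an MD5 fingerprint of a public key. For illustration, the same fingerprint can be computed from an OpenSSH public key line with stdlib Python (`md5_fingerprint` is a hypothetical helper, not part of any SSH tooling):

```python
# Compute the MD5 fingerprint of an OpenSSH public key line, in the same
# colon-separated format that `ssh-keygen -E md5 -lf` prints. A sketch.
import base64
import hashlib

def md5_fingerprint(pubkey_line):
    # An OpenSSH public key line looks like: "ssh-rsa AAAA... comment".
    # The fingerprint is the MD5 of the base64-decoded key blob.
    blob = base64.b64decode(pubkey_line.split()[1])
    digest = hashlib.md5(blob).hexdigest()
    return ":".join(digest[i:i + 2] for i in range(0, len(digest), 2))
```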
Option 3: Back Up with GCS
To back up your database using GCS, complete the following procedures:
Create a Service Account
Redis for PCF accesses your GCS store through a service account. Pivotal recommends that this account be solely for Redis for PCF. You must apply a minimal policy that lets the user account upload backups to your GCS store.
Do the following to create a service account with the correct permissions:
- In the GCS console, create a new service account for Redis for PCF: IAM and Admin > Service Accounts > Create Service Account.
- Enter a unique name in the Service account name field, such as `Redis-for-PCF`.
- In the Roles dropdown, grant the new service account the Storage Admin role.
- Select the Furnish a new private key checkbox so that a new key is created and downloaded.
- Click Create and take note of the name and location of the service account JSON file that is downloaded.
Configure Backups in Ops Manager
Do the following to connect Redis for PCF to GCS:
- Navigate to the Ops Manager Installation Dashboard and click the Redis for PCF tile.
- Click Backups.
- Under Backup configuration, select GCS.
- Fill in the fields as follows:

| Field | Description | Mandatory/Optional |
|---|---|---|
| Project ID | The Google Cloud Platform (GCP) project ID | Mandatory |
| Bucket name | Name of the bucket in which to store the backups | Mandatory |
| Service account private key | The JSON secret key associated with your service account | Mandatory |
| Cron Schedule | Backup schedule in crontab format. For example, once daily at 2 AM is `0 2 * * *`. This field also accepts a pre-defined schedule, such as `@yearly`, `@monthly`, `@weekly`, `@daily`, `@hourly`, or `@every TIME`, where `TIME` is any supported time string, such as `1h30m`. For more information, see the cron package documentation. | Mandatory |
| Backup timeout | The amount of time, in minutes, that the backup process waits for the `BGSAVE` command to complete on your instance before transferring the RDB file to your configured destination. If the timeout is reached, `BGSAVE` continues but backups fail and are not uploaded. | Mandatory |

- Click Save.
Option 4: Back Up to Azure
Do the following to back up your database to an Azure storage account:
- Navigate to the Ops Manager Installation Dashboard and click the Redis for PCF tile.
- Click Backups.
- Under Backup configuration, select Azure.
- Fill in the fields as follows:

| Field | Description | Mandatory/Optional |
|---|---|---|
| Account | Account name | Mandatory |
| Azure Storage Access Key | Azure-specific credentials required to write to the Azure container | Mandatory |
| Container Name | Name of the Azure container in which to store the backups | Mandatory |
| Destination Directory | Directory within the Azure container in which to store the backup files | Mandatory |
| Blob Store Base URL | URL pointing to the Azure resource | Optional |
| Cron Schedule | Backup schedule in crontab format. For example, once daily at 2 AM is `0 2 * * *`. This field also accepts a pre-defined schedule, such as `@yearly`, `@monthly`, `@weekly`, `@daily`, `@hourly`, or `@every TIME`, where `TIME` is any supported time string, such as `1h30m`. For more information, see the cron package documentation. | Mandatory |
| Backup timeout | The amount of time, in minutes, that the backup process waits for the `BGSAVE` command to complete on your instance before transferring the RDB file to your configured destination. If the timeout is reached, `BGSAVE` continues but backups fail and are not uploaded. | Mandatory |

- Click Save.
Back Up and Restore Manually
To back up or restore Redis manually, see Manually Backing Up and Restoring Redis for Pivotal Cloud Foundry in the Pivotal Support knowledge base.