Exporting Amazon ElastiCache Backup to Amazon S3
Written by:
Igor Gorovyy
DevOps Engineer Lead & Senior Solutions Architect
Introduction
Amazon ElastiCache is a fully managed in-memory data store service supporting Redis and Memcached. It provides automated backup functionality, enabling users to restore data in case of failures or migrate it to other environments. Exporting ElastiCache backups to Amazon S3 ensures long-term storage, data portability, and integration with other AWS services.
This guide outlines the steps to export an ElastiCache backup to an Amazon S3 bucket.
Prerequisites
Before proceeding, ensure that you have:
- An Amazon ElastiCache Redis cluster (backups are not supported for Memcached).
- An Amazon S3 bucket for storing the backup.
- Appropriate IAM permissions to allow ElastiCache to write to the S3 bucket.
- AWS CLI or AWS Management Console access.
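Before starting, it can help to confirm that the AWS CLI is installed and pointed at the intended account. A minimal sanity check (no resources are created):

aws --version
aws sts get-caller-identity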
Step 1: Create an S3 Bucket for Backup Storage
If you haven’t already, create an S3 bucket:
1. Navigate to the Amazon S3 console.
2. Click Create Bucket.
3. Provide a unique bucket name and choose a region.
4. Configure permissions as per your security requirements.
5. Click Create Bucket.
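If you prefer the CLI, the bucket can also be created with a single command. This is a sketch assuming the backups-product name used in the IAM policy below and a hypothetical eu-central-1 region; adjust both to your environment:

aws s3api create-bucket \
    --bucket backups-product \
    --region eu-central-1 \
    --create-bucket-configuration LocationConstraint=eu-central-1

For us-east-1, omit the --create-bucket-configuration flag.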
Step 2: Set Up an IAM Role for ElastiCache
ElastiCache needs permissions to write backups to the S3 bucket.
1. Create an IAM Policy
Create a new policy with the following permissions:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "ExportPolicy",
      "Effect": "Allow",
      "Action": [
        "s3:PutObject*",
        "s3:ListBucket",
        "s3:GetObject*",
        "s3:DeleteObject*",
        "s3:GetBucketLocation"
      ],
      "Resource": [
        "arn:aws:s3:::backups-product",
        "arn:aws:s3:::backups-product/*"
      ]
    }
  ]
}
Replace backups-product with your actual S3 bucket name.
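With the AWS CLI, the same policy can be created from a local file. This is a sketch; elasticache-s3-export-policy.json is a hypothetical filename containing the JSON document above, and the policy name is a placeholder:

aws iam create-policy \
    --policy-name ElastiCacheS3ExportPolicy \
    --policy-document file://elasticache-s3-export-policy.json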
2. Create an IAM Role
- Navigate to the IAM Console.
- Click Roles and select Create role.
- Choose AWS Service and select ElastiCache.
- Attach the policy you created.
- Complete the role creation and note the IAM Role ARN.
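The same role can be created from the CLI. This is a sketch under a few assumptions: the role and file names are placeholders, the trust policy uses the ElastiCache service principal, and the policy ARN matches the placeholder name from the previous step:

cat > trust-policy.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "Service": "elasticache.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }
  ]
}
EOF

aws iam create-role \
    --role-name ElastiCacheS3ExportRole \
    --assume-role-policy-document file://trust-policy.json

aws iam attach-role-policy \
    --role-name ElastiCacheS3ExportRole \
    --policy-arn arn:aws:iam::your-account-id:policy/ElastiCacheS3ExportPolicy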
Step 3: Create and Export a Backup
1. Create a Backup
Using AWS CLI:
aws elasticache create-snapshot \
    --cache-cluster-id your-cluster-id \
    --snapshot-name my-backup
Replace your-cluster-id with your ElastiCache cluster ID.
Using AWS Console:
1. Navigate to ElastiCache.
2. Select your Redis cluster.
3. Click Backups > Create Backup.
4. Enter a name and confirm.
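The export can only run once the snapshot has finished creating. You can poll its status from the CLI using the snapshot name from above; wait until it reports "available":

aws elasticache describe-snapshots \
    --snapshot-name my-backup \
    --query "Snapshots[0].SnapshotStatus"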
2. Export Backup to S3
Once the backup is created, export it to S3:
aws elasticache copy-snapshot \
    --source-snapshot-name my-backup \
    --target-snapshot-name my-backup-export \
    --target-bucket your-bucket-name

Replace:
- my-backup with the name of the snapshot you created.
- your-bucket-name with the target S3 bucket.

ElastiCache writes the exported backup to the bucket as one or more .rdb files prefixed with the target snapshot name.
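Once the export finishes, the .rdb files should be visible in the bucket. A quick check (replace the bucket name as above):

aws s3 ls s3://your-bucket-name/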
Step 4: Advanced Python Script to Export Redis Data to S3
As an alternative to the native snapshot export, the following Python script reads every Redis key and value (strings, lists, sets, hashes, and sorted sets) and uploads the result to Amazon S3 as a single JSON file:
import json
import os
from datetime import datetime

import boto3
import redis

# Redis configuration
REDIS_HOST = os.getenv("REDIS_HOST", "your-elasticache-endpoint")
REDIS_PORT = int(os.getenv("REDIS_PORT", 6379))
REDIS_PASSWORD = os.getenv("REDIS_PASSWORD", None)  # Leave as None if no authentication is required

# S3 configuration
S3_BUCKET_NAME = os.getenv("S3_BUCKET_NAME", "your-s3-bucket-name")
S3_OBJECT_KEY = f"redis_backup_{datetime.now().strftime('%Y-%m-%d_%H-%M-%S')}.json"

# Connect to Redis. With decode_responses=True, responses are returned as str,
# so the type checks below compare against plain strings rather than bytes.
redis_client = redis.StrictRedis(
    host=REDIS_HOST,
    port=REDIS_PORT,
    password=REDIS_PASSWORD,
    decode_responses=True,
)


def get_redis_data():
    """Fetch all Redis keys and extract their values based on data type."""
    data = {}
    for key in redis_client.scan_iter("*"):  # SCAN-based iteration avoids blocking the server
        key_type = redis_client.type(key)
        if key_type == "string":
            data[key] = redis_client.get(key)
        elif key_type == "list":
            data[key] = redis_client.lrange(key, 0, -1)
        elif key_type == "set":
            data[key] = list(redis_client.smembers(key))
        elif key_type == "hash":
            data[key] = redis_client.hgetall(key)
        elif key_type == "zset":
            data[key] = redis_client.zrange(key, 0, -1, withscores=True)
    return data


def upload_to_s3(json_data):
    """Upload the JSON backup to Amazon S3."""
    s3_client = boto3.client("s3")
    s3_client.put_object(
        Bucket=S3_BUCKET_NAME,
        Key=S3_OBJECT_KEY,
        Body=json_data,
        ContentType="application/json",
    )
    print(f"Backup successfully uploaded to S3: s3://{S3_BUCKET_NAME}/{S3_OBJECT_KEY}")


def lambda_handler(event, context):
    """AWS Lambda entry point."""
    redis_data = get_redis_data()
    json_data = json.dumps(redis_data, indent=4)
    upload_to_s3(json_data)
    return {"status": "Success", "message": "Backup completed"}


# Local execution
if __name__ == "__main__":
    redis_data = get_redis_data()
    json_data = json.dumps(redis_data, indent=4)
    upload_to_s3(json_data)
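To run the script locally, install the dependencies and export the connection settings as environment variables first. This is a sketch; export_redis_to_s3.py is a hypothetical filename for the script above:

pip install redis boto3

export REDIS_HOST=your-elasticache-endpoint
export REDIS_PORT=6379
export S3_BUCKET_NAME=your-s3-bucket-name

python export_redis_to_s3.py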
Step 5: Automate with AWS Lambda
- Deploy the script as an AWS Lambda function, configured to run in the same VPC, subnets, and security groups as the ElastiCache cluster so it can reach the Redis endpoint.
- Grant the function IAM permissions to write to the S3 bucket.
- Schedule the function with Amazon EventBridge (formerly CloudWatch Events), as sketched below.
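A minimal scheduling sketch with the CLI; the function name (redis-backup), rule name, region, and account ID are placeholders, and the Lambda function is assumed to already exist:

aws events put-rule \
    --name daily-redis-backup \
    --schedule-expression "rate(1 day)"

aws lambda add-permission \
    --function-name redis-backup \
    --statement-id allow-eventbridge \
    --action lambda:InvokeFunction \
    --principal events.amazonaws.com \
    --source-arn arn:aws:events:your-region:your-account-id:rule/daily-redis-backup

aws events put-targets \
    --rule daily-redis-backup \
    --targets "Id"="1","Arn"="arn:aws:lambda:your-region:your-account-id:function:redis-backup"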
Diagram
sequenceDiagram
    participant User
    participant Lambda as AWS Lambda (Optional)
    participant Redis as Amazon ElastiCache (Redis)
    participant PythonScript as Python Script
    participant S3 as Amazon S3
    User->>Lambda: (Optional) Schedule backup via EventBridge
    Lambda->>PythonScript: Trigger execution
    PythonScript->>Redis: Connect to Redis
    PythonScript->>Redis: Fetch all keys and values (scan_iter)
    Redis-->>PythonScript: Return data
    PythonScript->>PythonScript: Convert data to JSON
    PythonScript->>S3: Upload JSON backup
    S3-->>PythonScript: Acknowledge upload
    PythonScript-->>Lambda: Return success message
    Lambda-->>User: Notify completion
For further details, refer to the AWS Documentation.
Conclusion
Exporting Amazon ElastiCache backups to S3 is essential for long-term data storage, disaster recovery, and migration purposes. By setting up the correct IAM permissions and following the outlined steps, you can seamlessly integrate ElastiCache with S3 for efficient backup management.