Backup Website to Amazon S3 Shell Script

  • vnull
  • Pub Jan 23, 2023
  • Edited Jan 27, 2023
  • 4 minutes read

Getting Started

Amazon Simple Storage Service (Amazon S3) is a cloud-based object storage service. It is a low-cost storage option widely used for backups and static website content.

You can use the AWS CLI command-line utility to manage S3 buckets and their contents. In this post, you will learn how to back up a website to an Amazon S3 bucket using a shell script.

Setup

This post assumes the AWS CLI is already installed. If you haven't set up the AWS CLI yet, install and configure it before continuing.

Check out the references for all of the commands used.
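
As a quick sanity check (assuming a default profile is already configured), you can verify the installation and the active credentials before running any of the scripts below:

aws --version
aws sts get-caller-identity --profile default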

Create A Shell Script

Create a shell script file and add the content below. For this post, create the file as /scripts/s3WebsiteBackup.sh:

Variables

Name                Description
bucket_name         Globally unique S3 bucket name
website_directory   /path/to/website/
region              The AWS Region where the bucket is created, i.e. us-east-2, eu-west-3
profile             Named AWS CLI profile to use, e.g. the [default] profile in ~/.aws/config with region=us-west-2

#!/bin/bash

bucket_name='unique-bucket-name'
website_directory='/path/to/website/'

region='us-east-1'
profile='default'

1. Create a new bucket with a unique name


aws s3 mb \
  --profile $profile \
  --region $region \
  "s3://$bucket_name"
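
To confirm the bucket was created (an optional check, not part of the script), you can probe it with head-bucket, which exits non-zero if the bucket does not exist or is inaccessible:

aws s3api head-bucket \
  --profile $profile \
  --bucket $bucket_name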

2. Enable public access to the bucket


aws s3api put-public-access-block \
  --profile $profile \
  --region $region \
  --bucket $bucket_name \
  --public-access-block-configuration "BlockPublicAcls=false,IgnorePublicAcls=false,BlockPublicPolicy=false,RestrictPublicBuckets=false"
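
To verify the new settings took effect (optional, using the same variables), read the configuration back:

aws s3api get-public-access-block \
  --profile $profile \
  --region $region \
  --bucket $bucket_name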

3. Update the bucket policy for public read access

aws s3api put-bucket-policy \
  --profile $profile \
  --region $region \
  --bucket $bucket_name \
  --policy "{
  \"Version\": \"2012-10-17\",
  \"Statement\": [
      {
          \"Sid\": \"PublicReadGetObject\",
          \"Effect\": \"Allow\",
          \"Principal\": \"*\",
          \"Action\": \"s3:GetObject\",
          \"Resource\": \"arn:aws:s3:::$bucket_name/*\"
      }
  ]
}"

4. Enable the S3 bucket to host an index and error HTML page

aws s3 website "s3://$bucket_name" \
  --profile $profile \
  --region $region \
  --index-document index.html \
  --error-document index.html
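
Once website hosting is enabled, the content is served from the bucket's website endpoint rather than the regular S3 endpoint. The hostname format varies slightly by Region; for a bucket in us-east-1 it looks like this (example URL, substitute your bucket name):

http://unique-bucket-name.s3-website-us-east-1.amazonaws.com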

5. Upload your website

aws s3 sync \
  --profile $profile \
  --region $region \
  $website_directory "s3://$bucket_name/"
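
By default, sync only copies new and changed files and never deletes anything remotely. If you also want the bucket to mirror local deletions, the --delete flag removes remote files that no longer exist locally (a variation on the script above; use with care):

aws s3 sync \
  --profile $profile \
  --region $region \
  --delete \
  $website_directory "s3://$bucket_name/"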

6. Error Handling

if [ $? -eq 0 ]; then
  echo "Backup successfully uploaded to s3 bucket"
else
  echo "Error in s3 backup"
fi
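
A stricter alternative (a sketch, not part of the original script): adding set -euo pipefail near the top makes bash abort on the first failed command, unset variable, or failed pipeline stage, so a failure cannot slip through unnoticed:

#!/bin/bash
# Abort immediately on any error, unset variable, or failed pipe stage
set -euo pipefail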

Complete Script

File: /scripts/s3WebsiteBackup.sh

#!/bin/bash

bucket_name='unique-bucket-name'
website_directory='/path/to/website/'

region='us-east-1'
profile='default'

# 1. Create a new bucket with a unique name
aws s3 mb \
  --profile $profile \
  --region $region \
  "s3://$bucket_name"

# 2. Enable public access to the bucket
aws s3api put-public-access-block \
  --profile $profile \
  --region $region \
  --bucket $bucket_name \
  --public-access-block-configuration "BlockPublicAcls=false,IgnorePublicAcls=false,BlockPublicPolicy=false,RestrictPublicBuckets=false"

# 3. Update the bucket policy for public read access:
aws s3api put-bucket-policy \
  --profile $profile \
  --region $region \
  --bucket $bucket_name \
  --policy "{
  \"Version\": \"2012-10-17\",
  \"Statement\": [
      {
          \"Sid\": \"PublicReadGetObject\",
          \"Effect\": \"Allow\",
          \"Principal\": \"*\",
          \"Action\": \"s3:GetObject\",
          \"Resource\": \"arn:aws:s3:::$bucket_name/*\"
      }
  ]
}"

# 4. Enable the S3 bucket to host an index and error HTML page
aws s3 website "s3://$bucket_name" \
  --profile $profile \
  --region $region \
  --index-document index.html \
  --error-document index.html

# 5. Upload your website
aws s3 sync \
  --profile $profile \
  --region $region \
  $website_directory "s3://$bucket_name/" 

# 6. Error Handling
if [ $? -eq 0 ]; then
  echo "Backup successfully uploaded to s3 bucket"
else
  echo "Error in s3 backup"
fi

Destroy the Bucket

Create a shell script file and add the content below to destroy the S3 bucket. For this post, create the file as /scripts/s3WebsiteDestroy.sh:

#!/bin/bash

bucket_name='unique-bucket-name'

region='us-east-1'
profile='default'


# Empty the bucket by removing all objects
aws s3 rm \
  --profile $profile \
  --region $region \
  --recursive s3://$bucket_name

# Delete the now-empty bucket
aws s3api delete-bucket \
  --profile $profile \
  --region $region \
  --bucket $bucket_name
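
As a shorter alternative (same effect, assuming the same variables), aws s3 rb with the --force flag empties and removes the bucket in a single command:

aws s3 rb \
  --profile $profile \
  --region $region \
  --force "s3://$bucket_name"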

Running Shell Script

Make the shell script executable by running the following command.

chmod +x /scripts/s3WebsiteBackup.sh 

Test the script by executing it manually.

bash /scripts/s3WebsiteBackup.sh 

The backup will be uploaded to the S3 bucket. You can verify it using the aws s3 ls command.

aws s3 ls
2023-01-24 19:46:02 unique-bucket-name

Schedule Script in Cron

Edit the crontab of the current user by typing crontab -e, then add the following entry to run the backup daily at 2:00 AM:

0 2 * * * bash /scripts/s3WebsiteBackup.sh 
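
Because cron runs non-interactively, it can help to capture the script's output for later review. A sketch, with a hypothetical log path:

0 2 * * * bash /scripts/s3WebsiteBackup.sh >> /var/log/s3WebsiteBackup.log 2>&1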

Summary

This post provided a shell script to back up website content to an S3 bucket.
