
Amazon S3 for Photo Backups

June 11, 2017 Art Zemon

I have a little problem. Well, maybe it’s a big problem. I need a backup of the /home2/art directory on my computer. All told, it contains about 19,000 photos, a few full-length movies, the video from my wedding, and some audio. It weighs in at 227 GB. My /home/art directory has another 17,000 photos but, since it is “only” 91 GB, it already has a good backup. Last week, I found a solution. To make a backup, I type this at the command line and then I wait:

aws s3 sync /home2/art s3://bd4c/home2/ --delete

And if I ever need to restore the directory, I will type this:

aws s3 sync s3://bd4c/home2/ /home2/art

Read on to learn how to do this yourself. It works on Windows, Mac, and Linux.

Requirements

To arrive at that backup solution, I started with a few requirements:

  1. It has to run on Linux.
  2. Backups have to be easy and scriptable. I want to schedule (or manually run) a simple command and have the backups “just happen.” If something goes wrong (like a power failure or an internet glitch), I want to just run the command again and everything should magically heal itself.
  3. Restorations have to be super easy. Whether I want a single file or a directory of files or all of my files restored, I want to just run a simple command and get my files back. As with the backups, if something goes wrong, I want to just rerun the command and the problem should vanish.
  4. It has to be reliable. Once I set the system up, I want to be absolutely certain that my files will be there if (when) I need them.
  5. It has to be cheap.

There are a few things which I do not need, and this simplifies the issue.

  • I do not need point-in-time recovery. I.e., I will never want to restore things “as they were 10 days ago.” I just want an archival copy of my pictures. I add things. I don’t delete them.
  • I do not have a preconceived notion of where the backups get stored. I do not care whether they are near me or “in the cloud.”
  • I do not care whether a restore takes a long time, even several days.

Amazon Web Services (AWS) has two services which, combined, do everything that I need.

Amazon Glacier provides highly reliable, long-term file storage for a measly $0.004 per GB per month. “It is designed to provide average annual durability of 99.999999999%” for each file. Those seem like pretty good odds to me.

Amazon Simple Storage Service (S3) has a command line program which will let me back up and restore files with a trivial amount of effort. It also lets me define “lifecycle rules” which will automatically move the files from the relatively expensive S3 standard storage class ($0.023 per GB) to the much less expensive Glacier storage class ($0.004 per GB).

Overview

Here are the steps we will follow:

  1. Create the S3 bucket
  2. Create the lifecycle rule
  3. Create the AWS IAM user
  4. Install the AWS command line interface tools
  5. Create the backup

Steps 1-4 only need to be done once.

1) Create the S3 Bucket

Creating an AWS S3 bucket

Using the AWS Console, I started by creating an Amazon S3 bucket named “bd4c,” because my desktop computer is named “bd4c.” (I know… too clever.) I selected the US East (Ohio) region for its low price. I enabled versioning, so that deleted files can still be recovered, even though I almost never delete files from these directories. (You can click on any of these screen snapshots to see versions that are large enough to read.)
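
I created the bucket in the console, but the same thing can be done from the command line. Here is a minimal sketch with the AWS CLI, using the same bucket name and region:

aws s3 mb s3://bd4c --region us-east-2
aws s3api put-bucket-versioning --bucket bd4c --versioning-configuration Status=Enabled

The second command turns on the versioning I mentioned above.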

2) Create the Lifecycle Rule

Creating an AWS S3 bucket lifecycle rule

Still using the AWS Console, I created a lifecycle rule to reduce the cost of the backups by automatically moving the files to the least expensive storage class; a CLI equivalent is sketched after this list. This was as easy as clicking Management -> Lifecycle -> “Add lifecycle rule” and selecting these options:

  • Newly uploaded files have the Standard storage class which costs $0.023 per GB.
  • 30 days after a file is uploaded, its storage class gets changed from Standard to Standard Infrequent Access (Standard-IA). This drops the price to $0.0125 per GB. It adds the constraint that I will pay for 30 days of storage for each file, even if I delete it sooner, but that is OK because this is archival storage.
  • 90 days after a file is uploaded, its storage class gets changed from Standard Infrequent Access to Glacier. This drops the price to a miserly $0.004 per GB. Glacier adds three constraints:
    • The files are not always accessible. To download a file, I first have to initiate a restore. This typically takes 3-5 hours, though it can be expedited if I pay a higher fee.
    • The minimum storage time is 90 days. If I delete a file sooner, I still pay for 90 days.
    • The minimum storage size is 128 KB. Since I am backing up big files (photos and videos), this will not affect me.
  • After 90 days, the lifecycle rule deletes previous versions of files. It does not move them to Glacier.
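
The console did all of this for me, but the same rule can be written with the s3api command. This is a sketch rather than what I actually ran; the rule ID is my own invention:

aws s3api put-bucket-lifecycle-configuration --bucket bd4c --lifecycle-configuration '{
    "Rules": [{
        "ID": "archive-to-glacier",
        "Status": "Enabled",
        "Filter": {"Prefix": ""},
        "Transitions": [
            {"Days": 30, "StorageClass": "STANDARD_IA"},
            {"Days": 90, "StorageClass": "GLACIER"}
        ],
        "NoncurrentVersionExpiration": {"NoncurrentDays": 90}
    }]
}'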

3) Create an AWS IAM User

I used the AWS Console to create an IAM user that has access restricted to S3. After creating the user, I noted the access key and the secret access key.
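
If you prefer the command line, an equivalent sketch looks like this. The user name is just an example, and the managed AmazonS3FullAccess policy is broader than strictly necessary (a custom policy could limit access to the one bucket):

aws iam create-user --user-name photo-backup
aws iam attach-user-policy --user-name photo-backup --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess
aws iam create-access-key --user-name photo-backup

The last command prints the access key and secret access key; save them somewhere safe.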

4) Install the AWS Command Line Interface (AWS CLI)

I run Ubuntu Linux on my computer, so there are two easy ways to install the AWS command line utilities. I can get them from the official Ubuntu repositories with sudo aptitude install awscli or from the Python Package Index with sudo pip install awscli. I chose pip because its version gets updated more frequently than the Ubuntu package.

I ran aws configure and entered the access keys. I also set the default region to us-east-2, the US East (Ohio) region which I mentioned earlier.
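
The configuration step is interactive and looks roughly like this (keys redacted):

$ aws configure
AWS Access Key ID [None]: AKIA...
AWS Secret Access Key [None]: ...
Default region name [None]: us-east-2
Default output format [None]: json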

5) Create a Backup

Now I can create a backup by running this command:

aws s3 sync /home2/art s3://bd4c/home2/ --delete

This synchronized my local directory with the S3 bucket, deleting any remote files which no longer exist on the local disk.

Here is the result, after several days of uploading:

$ aws s3 ls s3://bd4c/home2 --recursive --human-readable --summarize | tail -2
Total Objects: 54974
   Total Size: 226.4 GiB
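
Remember requirement 2: backups should be scriptable and schedulable. Because the sync command is restartable and idempotent, a single cron entry is enough. A sketch, with a log file location of my own choosing:

0 2 * * * aws s3 sync /home2/art s3://bd4c/home2/ --delete >> /home/art/s3-backup.log 2>&1

That runs the backup every night at 2:00 AM and appends the output to a log file.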

Costs

Storage is cheap. For my 227 GB, I will pay approximately:

  • $5.25 for month 1
  • $2.85 per month for months 2 and 3
  • $0.91 per month thereafter
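
Those numbers are just the per-GB monthly rates applied to my 227 GB, rounded up slightly:

227 GB × $0.023 ≈ $5.22 (Standard)
227 GB × $0.0125 ≈ $2.84 (Standard-IA)
227 GB × $0.004 ≈ $0.91 (Glacier)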

Should I ever need to restore some files, I check to see if the files are in Glacier. If they are, I use the AWS Console to initiate a restore and wait for that operation to complete. It does incur a cost of $0.01 per GB and takes 3-5 hours but, if I am in a hurry, I can request an expedited restore for $0.03 per GB; that will complete in 5 minutes. Once the S3 restore is done, I can run aws s3 sync to copy the files from S3 back to my computer.
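
I would use the console for that, but a restore can also be initiated per object from the command line. A sketch, with a hypothetical key name:

aws s3api restore-object --bucket bd4c --key home2/photos/example.jpg --restore-request '{"Days": 7, "GlacierJobParameters": {"Tier": "Standard"}}'

The Tier can be “Expedited,” “Standard,” or “Bulk,” matching the price/speed trade-off above.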

I will also pay to transfer the data out of S3, should I ever need to restore it. Current price is $0.09 per GB, or $20.43 for the entire 227 GB. If I never do a restore then I never incur this cost.

Conclusion

This achieved all of my goals. The aws s3 sync command is reliable and restartable. With the --delete option, it makes the S3 bucket look almost exactly like my local files. (S3 does not store empty directories.) It is smart enough to only transfer files which are different, saving time.
