Backing up photos in AWS S3 Glacier

Kamil Burczyk
Feb 4, 2021 · 6 min read


If you were wondering how and where to back up your photos cheaply and safely, look no further. Amazon S3 Glacier is here to help.

A good practice is to follow the 3–2–1 backup rule: keep 3 copies of your data on 2 different media, with 1 of them stored off-site.

I assume you already have your photos somewhere, so in that case, AWS S3 ticks the last two boxes (another medium, stored far from your main device).

I’m using the Apple ecosystem, so my main photo storage is iCloud and the Photos app.

At the end of each year I export the whole year of photos, add any additional files I have (e.g. camera RAW files or external recordings), pack everything up and send it to the cloud.

You may be wondering how costly such storage is. Let’s find out!

There is a simple AWS cost calculator where you can estimate your spending based on the amount of data you’re going to store. Simply enter the expected values to find out that the cost is an astonishingly low ~$0.19 a month :)

The calculator also shows the cost difference between storage classes. Amazon S3 Glacier Deep Archive is by far the cheapest yet still extremely durable storage.
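As a back-of-envelope check (assuming roughly 200 GB of photos and the Deep Archive list price of about $0.00099 per GB-month, which varies slightly by region):

$ echo "200 * 0.00099" | bc
.19800

So a couple of hundred gigabytes really does come out at around 20 cents a month.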

Exporting from the Photos app

I start by creating a new Smart Album named after the year, with the appropriate date range.

Next, I just select the whole album and click Export.

A tip is to change File Naming to “Sequential” for 2 reasons:
1. If the export goes wrong you can easily restart from a specific offset instead of exporting the whole year from scratch,
2. You can be sure your photos and videos are exported in chronological order.

When you have the whole folder, zip it, just to make it easier (and cheaper) to use with AWS.
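On macOS this can be done straight from the terminal; the folder name here is just an example matching the archive used later in this article:

# pack the exported year into a single archive
zip -r photos-2020.zip photos-2020/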

Why AWS S3 Glacier Deep Archive?

AWS S3 (Amazon Web Services Simple Storage Service) is the market-leading cloud storage. It offers multiple storage classes, where generally you pay more for instant access and for storing your data in multiple copies, and less for infrequent access.

From my perspective, I’m looking for a last resort: only if I lose all my devices and my iCloud backup will I actually need this cloud backup.

Looking at the pricing reveals that S3 Glacier Deep Archive is the most wallet-friendly option.

The main takeaways are: it offers 99.999999999% durability, which, put into perspective, means that if you stored 10,000 objects you could expect to lose 1 of them every 10,000,000 years. Retrieval time is measured in hours (up to 12 hours).

Yes, 12 hours seems like a long time, but remember: you are not going to access it the same way as your iPhone gallery. It’s cold storage, there to save you when all other backups fail.

Preparing an S3 bucket

First, let’s start with logging in to the AWS Console. If you don’t have an account in AWS yet, you’ll need to create one and select your payment method (e.g. credit card).

In order to store any file in S3, you need to create a bucket, which is basically a directory in the cloud. The caveat is that its name needs to be unique worldwide.

In your AWS S3 console click “Create bucket”:

Choose a unique Bucket name and make sure that the Block all public access option is checked. You want your photos to remain private to you, after all. You can also choose the Region closest to you, to have the best possible connection.
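If you prefer the command line (set up later in this article), the same bucket can be created and locked down there as well. A rough sketch, assuming the eu-central-1 region used in the CLI configuration below:

# create the bucket (LocationConstraint is required outside us-east-1)
aws s3api create-bucket --bucket YOUR-BUCKET-NAME --region eu-central-1 --create-bucket-configuration LocationConstraint=eu-central-1

# block all public access, matching the console checkbox
aws s3api put-public-access-block --bucket YOUR-BUCKET-NAME --public-access-block-configuration BlockPublicAcls=true,IgnorePublicAcls=true,BlockPublicPolicy=true,RestrictPublicBuckets=true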

Once the bucket is created you can upload a file manually or do it through the command line. From my experience, manual upload works fine, but mostly for smaller files. If you have a big archive and not a very fast connection, it may be more beneficial to use the command line option, described later.

If you choose to upload a file manually, there’s a Storage class option where you can pick the one that fits you best. I’m going with Glacier Deep Archive.

After uploading a file you can see that it’s stored in Deep Archive and you can’t access it instantly (remember the ‘up to 12 hours’ retrieval time?). You can only Initiate restore and get notified when the file is brought back and copied to standard S3.
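For reference, the same restore can be requested from the command line (once the CLI from the next section is configured). A sketch, assuming the bucket and archive names used later; Days controls how long the temporary restored copy stays available:

# request a restore of a Deep Archive object (Standard tier takes up to ~12 hours)
aws s3api restore-object --bucket YOUR-BUCKET-NAME --key photos-2020.zip --restore-request '{"Days": 7, "GlacierJobParameters": {"Tier": "Standard"}}'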

Uploading files from the command line

First, create a new user in IAM so that you don’t use your root account directly with the command-line tool. The process is pretty straightforward.

You start by adding a new user with a dedicated name and Programmatic access.

You need to add proper permissions, otherwise the user won’t be able to do anything. You can do it by creating a group and adding the user to it, or by attaching a policy directly. In our case we need S3 access, e.g. AmazonS3FullAccess (the similarly named AmazonGlacierFullAccess covers the legacy Glacier vault service, not the Glacier storage classes within S3).

Tags are not important; then comes the summary and finally the access keys. The best way is to download the .csv with the keys to use later.
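If you’d rather script this part too, the console steps above map to a few CLI calls run from an already-configured admin profile. A sketch, with photo-backup as a made-up user name:

# create the user, grant S3 access and generate access keys
aws iam create-user --user-name photo-backup
aws iam attach-user-policy --user-name photo-backup --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess
aws iam create-access-key --user-name photo-backup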

In order to use AWS from the command-line, you need to install command-line tools for your platform: https://docs.aws.amazon.com/cli/latest/userguide/cli-chap-install.html

After you do that, you should run aws configure and then you’ll be able to use the generated Access Key:

$ aws configure 
AWS Access Key ID [None]: YOUR_ACCESS_KEY_ID
AWS Secret Access Key [None]: secret...
Default region name [None]: eu-central-1
Default output format [None]: json
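The entered values simply end up in a local credentials file, so you can double-check them later; roughly:

$ cat ~/.aws/credentials
[default]
aws_access_key_id = YOUR_ACCESS_KEY_ID
aws_secret_access_key = ...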

After that the actual command is really simple:

aws s3 cp photos-2020.zip s3://YOUR-BUCKET-NAME/ --storage-class DEEP_ARCHIVE

You should expect output similar to:

upload: ./photos-2020-command-line.zip to s3://kamil-photo-backup-1234/photos-2020-command-line.zip

And if all went well, you should be able to see the file in your bucket.
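You can also confirm the storage class from the command line; the output should look roughly like this (other fields omitted):

$ aws s3api head-object --bucket YOUR-BUCKET-NAME --key photos-2020.zip
{
    ...
    "StorageClass": "DEEP_ARCHIVE"
}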

Congratulations! You’ve just created your durable cloud-based storage and successfully stored your data.

AWS gives you plenty of additional options: encryption, regions, file versioning. Feel free to experiment and have your data safely stored!
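For example, versioning and default encryption can be switched on per bucket with two CLI calls; a sketch, reusing the bucket name from above:

# keep old versions of overwritten files
aws s3api put-bucket-versioning --bucket YOUR-BUCKET-NAME --versioning-configuration Status=Enabled

# encrypt new objects at rest with S3-managed keys (AES-256)
aws s3api put-bucket-encryption --bucket YOUR-BUCKET-NAME --server-side-encryption-configuration '{"Rules":[{"ApplyServerSideEncryptionByDefault":{"SSEAlgorithm":"AES256"}}]}'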
