Data backup using AWS S3cmd: a simple and effective solution

Worried that your data backup plan is a little anemic? S3cmd and AWS S3 can help.
Whether or not you’ve already got some kind of data backup system to protect your personal and work files from loss or attack, it’s probably not enough. Your data should ideally be stored securely and reliably far from your home or office, and preferably in more than one location. So even if you’ve already got a USB drive in a drawer somewhere holding an (outdated) archive, and a few more copies on Google Drive, it can’t hurt to add another layer.
There are, however, things that get in the way. “It’s too complicated to set up” some will complain. Or “I’ve tried before, but I can never remember to actually update it regularly.”
Do you have an Amazon AWS account? Are you reasonably comfortable with your operating system’s command line? Then I’ve got a dead simple, dirt cheap, bullet-proof DIY data backup plan that you can set up just once and then completely forget about. (Although you should devote a few moments to checking on it every now and then.) And it’s cheap. $0.03 per GB per month cheap.

Download and install S3cmd

If you haven’t already, you’ll need to get S3cmd working on your system. This process was thoroughly tested on Linux (without causing harm to any animals), and things should work pretty much the same on Mac. I believe that the AWS CLI for Windows has similar functionality.
First of all, if you haven’t already, install Python and wget:

sudo apt-get install python python-setuptools wget

Then, using wget, download the S3cmd package (1.5.2 is currently the latest version):

wget http://sourceforge.net/projects/s3tools/files/s3cmd/1.5.2/s3cmd-1.5.2.tar.gz

Run tar to unpack the archive:

tar xzvf s3cmd-1.5.2.tar.gz

Move into the newly created S3cmd directory:

cd s3cmd-1.5.2

…and run the install program:

sudo python setup.py install
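As an aside, if pip is available on your system, the same package can be installed straight from PyPI, skipping the download and unpacking steps above (I’m assuming here that you’re happy with whatever version PyPI currently carries):

sudo pip install s3cmd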

You are now ready to configure S3cmd:

s3cmd --configure

You’ll be asked to provide the Access Key ID and Secret Access Key of the AWS user account through which you’re planning to access S3, along with other authentication, encryption, and account details. The configure program will then offer to test your connectivity to S3, after which it will save your settings and you should be all ready to go.
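For reference, the configure step saves everything to a plain-text file at ~/.s3cfg. Here’s a heavily trimmed sketch of what it looks like; the values below are placeholders rather than real credentials, and the real file carries many more options:

[default]
access_key = YOUR_ACCESS_KEY_ID
secret_key = YOUR_SECRET_ACCESS_KEY
use_https = True

Since this file holds your AWS credentials in plain text, keep it readable by your own user only.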

Create your data backup job

Since a data backup without data doesn’t make a lot of sense, you’ll want to identify exactly which folders and files need backing up. You’ll also want to create a new bucket from the AWS console:
[Screenshot: creating a new bucket in the AWS S3 console]
So let’s suppose that you keep all your important data underneath a directory called workfiles, and you’ve named your bucket mybackupbucket8387. Here’s what your backup command will look like:

s3cmd sync /home/yourname/workfiles/ s3://mybackupbucket8387/ --delete-removed

The trailing slashes on both the source and target addresses are important, by the way: like rsync, S3cmd treats a source path with a trailing slash as “the contents of this directory” rather than the directory itself. The sketch below shows the difference.
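Here’s a quick sketch of that distinction using the example paths from above (this mirrors rsync’s convention, as I understand S3cmd’s behavior):

# With the trailing slash, the contents of workfiles land at the bucket root:
s3cmd sync /home/yourname/workfiles/ s3://mybackupbucket8387/
# Without it, everything ends up under a workfiles/ prefix instead:
s3cmd sync /home/yourname/workfiles s3://mybackupbucket8387/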
Let’s examine this command:
sync tells the tool that you want to keep the files at the source and target locations synchronized. That means a sync will first compare the contents of the two locations, and then upload copies of any files that are new or have changed locally. The two addresses simply define which two data locations are to be synced, and --delete-removed tells the tool to remove any files that exist in the S3 bucket but are no longer present locally.
Depending on how big your data backup will be, the first time you run this might take some time.
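Once that first run finishes, you can spot-check the results by listing the bucket’s contents (S3cmd’s ls command works much like its shell namesake):

s3cmd ls s3://mybackupbucket8387/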
Update: to make sure you don’t accidentally remove the wrong files, it’s always a good idea to run a sync command with the --dry-run argument before executing it for real:

s3cmd sync --dry-run /home/yourname/workfiles/ s3://mybackupbucket8387/ --delete-removed

There are cases when you might not want to use --delete-removed. Perhaps you would prefer to keep older versions of overwritten files archived and available. To do that, simply remove the --delete-removed argument from your command line and enable Versioning on your S3 bucket.
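Versioning lives under the bucket’s Properties in the console. Alternatively, if you also happen to have the AWS CLI installed and configured (an assumption on my part, since S3cmd itself doesn’t expose a versioning switch as far as I can tell), the same setting can be flipped from the command line:

aws s3api put-bucket-versioning --bucket mybackupbucket8387 --versioning-configuration Status=Enabled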
[Screenshot: enabling Versioning on the S3 bucket]
If you’d like to reduce your costs even further and also automatically age out overwritten files that have been sitting long enough, you could use the AWS console to create a Lifecycle rule for your bucket that will transfer previous versions of files older than, say, thirty days to Glacier, whose storage costs are only $0.01 per GB per month.
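For the curious, here’s roughly what such a rule looks like when expressed as an S3 lifecycle configuration document. This is only a sketch of what the console builds for you, assuming a rule (the ID is my own invention) that moves noncurrent versions to Glacier after thirty days; if you have the AWS CLI handy, saving it as lifecycle.json lets you apply it from the command line as well:

{
  "Rules": [
    {
      "ID": "archive-old-versions",
      "Status": "Enabled",
      "Filter": { "Prefix": "" },
      "NoncurrentVersionTransitions": [
        { "NoncurrentDays": 30, "StorageClass": "GLACIER" }
      ]
    }
  ]
}

aws s3api put-bucket-lifecycle-configuration --bucket mybackupbucket8387 --lifecycle-configuration file://lifecycle.json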
[Screenshot: configuring a Lifecycle rule in the S3 console]
So that’s a data backup that’s simple and cheap. But it’s not yet at the “set it up and forget about it” stage. There’s still one more really simple step (using my Ubuntu system, at least): create a cron job.
If you’d like to sync your files every hour, you can create a text file containing a short script like this:

#!/bin/bash
# Upload new and changed work files to the S3 backup bucket
s3cmd sync /home/yourname/workfiles/ s3://mybackupbucket8387/

…and, using sudo, save the file to the directory /etc/cron.hourly/. (Note that this version omits --delete-removed, which is the safer default for an unattended job; add the argument back if you want deletions propagated automatically.) One caveat: run-parts, which executes these hourly scripts, skips files whose names contain a dot, so call the file “mybackup” rather than “mybackup.sh”.
Assuming that you named your file “mybackup”, all that’s left is to make it executable using:

sudo chmod +x /etc/cron.hourly/mybackup

You should test it out over a couple of hours to make sure that the backups are actually happening, but that should be the last time you’ll ever have to think about this data backup archive – at least until your PC crashes.
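Two quick ways to check (paths assume a stock Ubuntu setup): run another dry-run sync to see whether anything still needs uploading, and grep the system log for the hourly cron run:

s3cmd sync --dry-run /home/yourname/workfiles/ s3://mybackupbucket8387/
grep cron.hourly /var/log/syslog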

Written by

A Linux system administrator with twenty years' experience as a high school teacher, David has been around the industry long enough to have witnessed decades of technology trend predictions; most of them turning out to be dead wrong.
