
Data Backup Using AWS S3cmd: a Simple And Effective Solution

Worried that your data backup plan is a little anemic?

AWS S3cmd can help.

Whether or not you’ve already got some kind of data backup system to protect your personal and work files from loss or attack, it’s probably not enough. Your data should ideally be securely and reliably stored far from your home or office. And preferably in more than one location. So even if you’ve already got a USB drive in a drawer somewhere holding an (outdated) archive, and some more kept on Google Drive, it can’t hurt to add another layer.

There are, however, things that get in the way. “It’s too complicated to set up,” some will complain. Or: “I’ve tried before, but I can never remember to actually update it regularly.”

Do you have an Amazon AWS account? Are you reasonably comfortable with your operating system’s command line? Then I’ve got a dead simple, dirt cheap, bullet-proof DIY data backup plan that you can set up just once and then completely forget about. (Although you should devote a few moments to checking on it every now and then.) And it’s cheap. $0.03 per GB per month cheap.

Download and install S3cmd

If you haven’t already, you’ll need to get S3cmd working on your system. This process was thoroughly tested on Linux (without causing harm to any animals), and things should work pretty much the same on Mac. I believe that the AWS CLI for Windows has similar functionality.

First of all, if you haven’t already, install Python and wget:

sudo apt-get install python python-setuptools wget

Then, using wget, download the S3cmd package (1.5.2 is currently the latest version):

wget http://sourceforge.net/projects/s3tools/files/s3cmd/1.5.2/s3cmd-1.5.2.tar.gz

Run tar to unpack the archive:

tar xzvf s3cmd-1.5.2.tar.gz

Move into the newly created S3cmd directory:

cd s3cmd-1.5.2

…and run the install program:

sudo python setup.py install

You are now ready to configure S3cmd:

s3cmd --configure

You’ll be asked to provide the Access Key ID and Secret Access Key of the AWS user account through which you’re planning to access S3, along with other authentication, encryption, and account details. The configure program will then offer to test your connectivity to S3, after which it will save your settings and you should be all ready to go.
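One quick way to confirm that the configuration took (assuming your account can already see at least one bucket) is to list your buckets:

s3cmd ls

If your credentials check out, you’ll get back the list of buckets the account has access to.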

Create your data backup job

Since a data backup without data doesn’t make a lot of sense, you’ll want to identify exactly which folders and files need backing up. You’ll also want to create a new bucket from the AWS console:
[Screenshot: creating a bucket in the AWS console, selecting a bucket name and region]
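If you’d rather not leave the terminal, S3cmd can also create the bucket for you. Here’s a quick sketch using the bucket name from this post’s example:

s3cmd mb s3://mybackupbucket8387

Remember that S3 bucket names must be globally unique, so you may need a few tries before one sticks.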
So let’s suppose that you keep all your important data under a directory called workfiles, and you’ve named your bucket mybackupbucket8387. Here’s what your backup command will look like:

s3cmd sync /home/yourname/workfiles/ s3://mybackupbucket8387/ --delete-removed

The trailing slashes on both the source and target addresses are important, by the way.
Let’s examine this command:

sync tells the tool that you want to keep the files in the source and target locations synchronized: an update will first check the contents of both locations, then copy over any files that exist in one but not the other. The two addresses simply define which two data locations are to be synced, and --delete-removed tells the tool to remove any files that still exist in the S3 bucket but are no longer present locally.

Depending on how big your data backup will be, the first time you run this might take some time.

Update: to make sure you don’t accidentally remove the wrong files, it’s always a good idea to run a sync command with the --dry-run argument before executing it for real:

s3cmd sync /home/yourname/workfiles/ s3://mybackupbucket8387/ --delete-removed --dry-run

There are cases when you might not want to use --delete-removed. Perhaps you would prefer to keep older versions of overwritten files archived and available. To do that, simply remove the --delete-removed argument from your command line and enable Versioning on your S3 bucket.
[Screenshot: enabling Versioning on the S3 bucket]
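If you happen to have the AWS CLI installed and configured, you can also switch Versioning on without visiting the console. This is a sketch using this post’s example bucket name:

aws s3api put-bucket-versioning --bucket mybackupbucket8387 --versioning-configuration Status=Enabled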
If you’d like to reduce your costs even further, and also build in an automatic delete for overwritten files that have been sitting around long enough, you could use the AWS console to create a Lifecycle rule for your bucket that transfers previous versions of files older than, say, thirty days to Glacier, whose storage costs are only $0.01 per GB per month.
[Screenshot: Lifecycle rule settings, including early deletion of Glacier objects]
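For the command-line inclined, here’s a sketch of the same sort of rule applied with the AWS CLI; the rule ID is arbitrary, and the bucket name is this post’s example:

aws s3api put-bucket-lifecycle-configuration \
  --bucket mybackupbucket8387 \
  --lifecycle-configuration '{
    "Rules": [{
      "ID": "archive-old-versions",
      "Status": "Enabled",
      "Filter": {"Prefix": ""},
      "NoncurrentVersionTransitions": [
        {"NoncurrentDays": 30, "StorageClass": "GLACIER"}
      ]
    }]
  }'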
So that’s a data backup that’s simple and cheap. But it’s not yet at the “set it up and forget about it” stage. There’s still one more really simple step (using my Ubuntu system, at least): create a cron job.

If you’d like to sync your files every hour, you can create a text file containing only these two lines:

#!/bin/bash
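# Hourly backup: keep the S3 bucket in sync with the local work files.
# Note: --delete-removed is omitted here; add it back if you want local
# deletions mirrored to the bucket.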
s3cmd sync /home/yourname/workfiles/ s3://mybackupbucket8387/

…and, using sudo, save the file to the directory /etc/cron.hourly/

Assuming that you named your file “mybackup” (note that cron’s run-parts will skip files with a dot in their names, so don’t add an extension), all that’s left is to make your file executable using:

sudo chmod +x /etc/cron.hourly/mybackup
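On Ubuntu, you can also ask run-parts which scripts in the directory it considers runnable, without actually running them:

sudo run-parts --test /etc/cron.hourly

If /etc/cron.hourly/mybackup shows up in the output, cron will pick it up.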

You should test it out over a couple of hours to make sure that the backups are actually happening, but that should be the last time you’ll ever have to think about this data backup archive, at least until your PC crashes.


Written by

David Clinton

A Linux system administrator with twenty years’ experience as a high school teacher, David has been around the industry long enough to have witnessed decades of technology trend predictions, most of which turned out to be dead wrong.
