Data Backup Using AWS S3cmd: A Simple and Effective Solution

Worried about the fact that your data backup plan is a little bit anemic?

AWS S3cmd can help.

Whether or not you’ve already got some kind of data backup system to protect your personal and work files from loss or attack, it’s probably not enough. Your data should ideally be securely and reliably stored far from your home or office. And preferably in more than one location. So even if you’ve already got a USB drive in a drawer somewhere holding an (outdated) archive, and some more kept on Google Drive, it can’t hurt to add another layer.

There are, however, things that get in the way. “It’s too complicated to set up,” some will complain. Or, “I’ve tried before, but I can never remember to actually update it regularly.”

Do you have an Amazon AWS account? Are you reasonably comfortable with your operating system’s command line? Then I’ve got a dead simple, dirt cheap, bullet-proof DIY data backup plan that you can set up just once and then completely forget about. (Although you should devote a few moments to checking on it every now and then.) And it’s cheap. $0.03 per GB per month cheap.

Download and install S3cmd

If you haven’t already, you’ll need to get S3cmd working on your system. This process was thoroughly tested on Linux (without causing harm to any animals), and things should work pretty much the same on Mac. I believe that the AWS CLI for Windows has similar functionality.

First, make sure Python, setuptools, and wget are installed:

sudo apt-get install python python-setuptools wget

Then, using wget, download the S3cmd package (1.5.2 is currently the latest version):

wget http://sourceforge.net/projects/s3tools/files/s3cmd/1.5.2/s3cmd-1.5.2.tar.gz

Run tar to unpack the archive:

tar xzvf s3cmd-1.5.2.tar.gz

Move into the newly created S3cmd directory:

cd s3cmd-1.5.2

…and run the install program:

sudo python setup.py install

You are now ready to configure S3cmd:

s3cmd --configure

You’ll be asked to provide the Access Key ID and Secret Access Key of the AWS user account through which you’re planning to access S3, along with other authentication, encryption, and account details. The configure program will then offer to test your connectivity to S3, after which it will save your settings and you should be all ready to go.
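Before going any further, it’s worth a quick sanity check; this command simply lists any buckets your credentials can already see:

s3cmd ls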

Create your data backup job

Since a data backup without data doesn’t make a lot of sense, you’ll want to identify exactly which folders and files need backing up. You’ll also want to create a new bucket from the AWS console:
[Screenshot: creating a bucket in the AWS console, selecting a bucket name and region]
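If you’d rather stay at the command line, s3cmd can create the bucket itself; the name here is the one used in the example that follows:

s3cmd mb s3://mybackupbucket8387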
So let’s suppose that you keep all your important data underneath a directory called workfiles, and you’ve named your bucket mybackupbucket8387. Here’s what your backup command will look like:

s3cmd sync /home/yourname/workfiles/ s3://mybackupbucket8387/ --delete-removed

The trailing slashes on both the source and target addresses are important, by the way: with the slash, s3cmd syncs the contents of workfiles into the bucket; without it, the directory itself is copied, so everything ends up under a workfiles/ prefix.

Let’s examine this command:

sync tells the tool that you want to keep the files at the source and target locations synchronized. That means each run will first compare the contents of both locations, then upload any files that are new or have changed locally. The two addresses simply define which two data locations are to be synced, and --delete-removed tells the tool to also remove any files that exist in the S3 bucket but are no longer present locally.
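Incidentally, if parts of your workfiles tree don’t need backing up (caches or temporary files, say), s3cmd’s --exclude option takes a glob pattern; the *.tmp pattern here is just an illustration:

s3cmd sync /home/yourname/workfiles/ s3://mybackupbucket8387/ --delete-removed --exclude '*.tmp'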

Depending on how big your data backup will be, the first time you run this might take some time.

Update: to make sure you don’t accidentally remove the wrong files, it’s always a good idea to run a sync command with the --dry-run argument before executing it for real. This shows what would be uploaded or deleted without actually doing it:

s3cmd sync --dry-run /home/yourname/workfiles/ s3://mybackupbucket8387/ --delete-removed

There are cases when you might not want to use --delete-removed. Perhaps you would prefer to keep older versions of overwritten files archived and available. To do that, simply remove the --delete-removed argument from your command line and enable Versioning on your S3 bucket.
[Screenshot: enabling Versioning on the bucket]
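If you happen to have the AWS CLI installed and configured, versioning can also be switched on from the terminal; this is just a sketch, and the console route shown above works just as well:

aws s3api put-bucket-versioning --bucket mybackupbucket8387 --versioning-configuration Status=Enabled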
If you’d like to reduce your costs even further and also build in an automatic delete for overwritten files that have been sitting long enough, you could use the AWS console to create a Lifecycle rule for your bucket that will transfer previous versions of files older than, say, thirty days to Glacier, whose storage costs only $0.01 per GB per month.
[Screenshot: Lifecycle rule settings, including early deletion of Glacier objects]
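Here, too, the AWS CLI offers a scripted alternative to the console; the sketch below assumes the CLI is configured, and the rule ID is made up:

aws s3api put-bucket-lifecycle-configuration --bucket mybackupbucket8387 \
  --lifecycle-configuration '{
    "Rules": [{
      "ID": "archive-old-versions",
      "Status": "Enabled",
      "Filter": {"Prefix": ""},
      "NoncurrentVersionTransitions": [
        {"NoncurrentDays": 30, "StorageClass": "GLACIER"}
      ]
    }]
  }'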
So that’s a data backup that’s simple and cheap. But it’s not yet at the “set it up and forget about it” stage. There’s still one more really simple step (using my Ubuntu system, at least): create a cron job.

If you’d like to sync your files every hour, you can create a small script file along these lines:

#!/bin/bash
# This script runs as root via /etc/cron.hourly, so s3cmd will look for
# root's configuration; run "sudo s3cmd --configure" once, or pass -c.
s3cmd sync /home/yourname/workfiles/ s3://mybackupbucket8387/

…and, using sudo, save the file to the directory /etc/cron.hourly/
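For example, if you created the script as a file named mybackup in your current directory:

sudo cp mybackup /etc/cron.hourly/mybackup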

One caveat: cron’s run-parts skips file names that contain a dot, so don’t add a .sh extension. All that’s left is to make the file executable using:

sudo chmod +x /etc/cron.hourly/mybackup
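Once the next hour rolls around, you can confirm that files are actually landing in the bucket; these two commands list the bucket’s contents and report its total size, using our example bucket name:

s3cmd ls --recursive s3://mybackupbucket8387/
s3cmd du s3://mybackupbucket8387/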

You should test it out over a couple of hours to make sure that the backups are actually happening, but that should be the last time you’ll ever have to think about this data backup archive, at least until your PC crashes.
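And if that day ever comes, restoring is simply the same sync run in the opposite direction; this sketch pulls everything from the bucket back down into a local directory (adjust the paths to suit):

s3cmd sync s3://mybackupbucket8387/ /home/yourname/workfiles/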


Written by

David Clinton

A Linux system administrator with twenty years' experience as a high school teacher, David has been around the industry long enough to have witnessed decades of technology trend predictions; most of them turning out to be dead wrong.

