The need for a reliable data archive is well known, so we won't dwell on it. Instead, we'll discuss the various data archive options AWS offers its customers. First, though, we should draw an important distinction between data archives and data backups, since the purpose and function of the two are often confused.
A data archive holds data that is no longer in active use but must be moved to separate storage for long-term preservation and retention. Besides preservation, a key goal of a data archive is to reduce storage costs. An archive is not intended to help your system recover from a disaster or failure. Backups, which are performed on both active and inactive data, are what permit recovery from data loss.
Bearing that in mind, the ability to quickly restore data is likely far more important for a backup than it is for an archive. Such considerations will shape the solutions you choose for your data archive versus your data backup.
The data archive: traditional considerations
- You would need to see far into the future, as the medium you use today may not exist in ten years. Identifying a viable long-term storage platform can therefore be a real challenge.
- While archived data may not be currently active, it is generally still intended for eventual production use. Reliable security over long periods of time therefore becomes a critical goal.
- Data archives tend to grow with time, so you will need to realistically consider future costs and scalable infrastructure needs upfront.
- Most organizations – and especially governments – are very particular about the availability of archived data. The process of meeting such expectations may lead you to improved disaster recovery strategies, but it can also be complicated and expensive.
- Implementation can require significant skills and experience across multiple technologies.
However, many of these concerns simply wouldn’t apply to a data archive in the Cloud. Using AWS, for instance, means you never need to invest in a particular technology or medium, or worry about changing standards – that’s all Amazon’s headache. And your costs will always be a direct product of the services you actually use.
Your data archive and the AWS Cloud
Of course, AWS isn’t the only player offering out-of-the-box cloud archiving services, but they’re a good place to start.
S3 and Glacier are, one way or another, the primary AWS tools you'll use for your data archive. We'll look at three common use-case scenarios: archives for AWS-based data, on-premises data, and hybrid data solutions.
1. Applications deployed within AWS
If the application to be archived is running within the AWS environment, then integration with S3 or Glacier should be straightforward. Since a data archive doesn't demand frequent reads, you would normally opt for the cheaper Glacier, even though retrievals can take several hours. If, however, you're already storing some application data in S3 (like videos or application logs) and don't want to write the extra code needed to move inactive data to Glacier yourself, you may instead consider automatically moving only the old, inactive data from S3 to Glacier.
AWS allows you to configure and manage the automated lifecycle of objects in your S3 buckets. You could therefore create a configuration that causes S3 objects to be moved to Glacier based on specified conditions or policies. A sample policy may look like this:
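For illustration, here is a minimal lifecycle configuration in the JSON form accepted by the S3 API; the `logs/` prefix and the 90-day window are placeholders, not recommendations. It transitions matching objects to Glacier 90 days after creation:

```json
{
  "Rules": [
    {
      "ID": "archive-old-logs",
      "Filter": { "Prefix": "logs/" },
      "Status": "Enabled",
      "Transitions": [
        { "Days": 90, "StorageClass": "GLACIER" }
      ]
    }
  ]
}
```

Once applied to a bucket, S3 evaluates the rule automatically; no application code is needed to perform the transition.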
2. Applications deployed on premises
If the components of your application (like a web server, database, application server, and NFS server) are running within your own data center, but you still want to use AWS to archive your backed-up data, the simplest solution is to integrate your backup server with Amazon S3 or Glacier. This diagram may help you visualize the architecture:
If you're already backing up to Amazon S3 instead of a local backup server, then you can use S3 lifecycle management to quickly add a Glacier-based archive layer to your infrastructure.
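That transition rule can also be set up programmatically. Below is a minimal sketch in Python; the bucket name, `backups/` prefix, and 90-day window are assumptions for illustration. The helper builds a rule dict in the shape expected by boto3's `put_bucket_lifecycle_configuration` call, shown in the comment:

```python
def glacier_transition_rule(prefix: str, days: int) -> dict:
    """Build an S3 lifecycle rule that transitions objects under
    `prefix` to the GLACIER storage class after `days` days."""
    return {
        "ID": f"archive-{prefix.strip('/') or 'all'}",
        "Filter": {"Prefix": prefix},
        "Status": "Enabled",
        "Transitions": [{"Days": days, "StorageClass": "GLACIER"}],
    }

# Applying the rule requires AWS credentials, so it is shown here
# as a comment (bucket name is hypothetical):
#   import boto3
#   s3 = boto3.client("s3")
#   s3.put_bucket_lifecycle_configuration(
#       Bucket="my-backup-bucket",
#       LifecycleConfiguration={
#           "Rules": [glacier_transition_rule("backups/", 90)]},
#   )
```

Keeping the rule-building logic separate from the API call makes it easy to review (and unit-test) the policy before it touches a live bucket.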
Even if your backup server doesn't natively support AWS cloud integration, you can still create a seamless and secure interface between your data center and AWS's storage infrastructure using AWS Storage Gateway. Storage Gateway doesn't require a dedicated network link between your corporate network and AWS, and it supports industry-standard storage protocols while storing your data, encrypted, in Amazon S3.
3. Applications deployed in a hybrid setup
In this kind of setup, an application deployed on AWS might interact with on-premises components (or the other way around). In such cases, you may want to extend an existing archiving strategy to the cloud, requiring only a reliable way to connect your two networks: either a standard VPN setup or AWS Direct Connect, which makes it easy to establish a dedicated network connection from your premises to AWS.
Data archive compliance and regulations
Many customers have specific data retention policies and must often comply with regulatory guidelines. Amazon Glacier addresses this with Vault Lock: a Vault Lock policy lets you apply compliance controls, such as time-based retention, to the contents of any Glacier vault, and once locked, the policy can no longer be changed.
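As a hypothetical example, a Vault Lock policy might deny deletion of any archive younger than one year; the region, account ID, vault name, and 365-day window below are all placeholders:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "deny-deletes-under-365-days",
      "Principal": "*",
      "Effect": "Deny",
      "Action": "glacier:DeleteArchive",
      "Resource": "arn:aws:glacier:us-east-1:123456789012:vaults/examplevault",
      "Condition": {
        "NumericLessThan": { "glacier:ArchiveAgeInDays": "365" }
      }
    }
  ]
}
```

Because a completed lock is immutable, it's worth testing a policy like this thoroughly during the lock's in-progress window before confirming it.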
To review, here are some of the key advantages you can enjoy by archiving your data in the cloud…and with AWS in particular:
- No more need to rely on risky predictions of your data growth and the corresponding storage capacity.
- Reduced overhead of managing huge data stores for long periods.
- Reduced cost.
- Increased availability.
- No more need to identify and invest in particular hardware and skills to implement a reliable, long-term archival design.
Do you have your own cloud/local archiving experience? Let us know in the comments.