Since the need for a reliable data archive is well known, there’s no need for us to focus on that. Instead, we’ll discuss the various data archive options AWS offers its customers. However, we should first make an important distinction between data archives and data backups – as the purpose and function of the two should not be confused.
A data archive holds data that is no longer in active use but must be moved to separate storage for long-term preservation and retention. Besides preservation, a key goal of a data archive is to reduce storage costs. An archive is not intended to help your system recover from a disaster or failure. Backups – which are performed on both active and inactive data – are designed to permit recovery from data loss.
Bearing that in mind, the ability to quickly restore data is likely far more important for a backup than for an archive. These considerations will shape the solutions you choose for your data archive versus your data backup.
The data archive: traditional considerations
- You would need to see far into the future, as the medium you are using today may not exist in ten years. It can therefore be a real challenge to identify a viable long-term storage platform.
- While archived data may not be currently active, it is generally still intended for eventual production use. Reliable security over long periods of time therefore becomes a critical goal.
- Data archives tend to grow with time, so you will need to realistically consider future costs and scalable infrastructure needs upfront.
- Most organizations – and especially governments – are very particular about the availability of archived data. Meeting such expectations may lead you to improved disaster recovery strategies, but it can also be complicated and expensive.
- Implementation can require significant skills and experience across multiple technologies.
However, many of these concerns simply wouldn’t apply to a data archive in the Cloud. Using AWS, for instance, means you never need to invest in a particular technology or medium, or worry about changing standards – that’s all Amazon’s headache. And your costs will always be a direct product of the services you actually use.
Your data archive and the AWS Cloud
Of course, AWS isn’t the only player offering out-of-the-box cloud archiving services, but they’re a good place to start.
S3 and Glacier are, one way or another, the primary AWS tools you’ll use for your data archive. We’ll look at three common use-case scenarios: archives for AWS-based data, on-premises data, and hybrid deployments.
1. Applications deployed within AWS
If the application to be archived is running within the AWS environment, then integration with S3 or Glacier should be straightforward. Since a data archive doesn’t demand frequent reads, you would normally opt for the cheaper Glacier, even though retrieval can take several hours. If, however, you’re already storing some application data in S3 (like videos or application logs) and don’t want to write the extra code needed to push data directly to Glacier, you can instead move only the old, inactive data from S3 to Glacier.
AWS allows you to configure and manage the automated lifecycle of objects in your S3 buckets. You could therefore create a configuration that causes S3 objects to be moved to Glacier based on specified conditions or policies.
A sample policy may look like this:
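As a minimal sketch, here is a lifecycle configuration in the JSON format accepted by the `aws s3api put-bucket-lifecycle-configuration` CLI command. The `logs/` prefix, rule ID, and 90-day threshold are illustrative assumptions, not values from any particular deployment:

```json
{
  "Rules": [
    {
      "ID": "archive-old-logs",
      "Status": "Enabled",
      "Filter": { "Prefix": "logs/" },
      "Transitions": [
        { "Days": 90, "StorageClass": "GLACIER" }
      ]
    }
  ]
}
```

With this rule enabled, any object under the `logs/` prefix is automatically transitioned to the Glacier storage class 90 days after it was created, with no application code required.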
2. Applications deployed on premises
If the components of your application (like a web server, database, application server, and NFS server) are running within your datacenter, but you still want to use AWS for archiving your backed-up data, the simplest solution is to integrate your backup server with AWS S3 or Glacier. This diagram may help you visualize the architecture:
If you’re already using AWS S3 for your backups instead of a local backup server, then you can use S3 Lifecycle management to quickly add a data archive layer using Glacier to your infrastructure.
Even if your backup server doesn’t natively support AWS cloud integration, you can still create a seamless, secure interface between your data center and AWS’s storage infrastructure using AWS Storage Gateway. Storage Gateway doesn’t require a dedicated network link between your corporate network and AWS, supports industry-standard storage protocols, and stores your data encrypted in AWS S3.
3. Applications deployed in a hybrid setup
In this kind of setup, an application deployed on AWS might interact with on-premises components (or the other way around). In such cases, you may want to extend an existing archiving strategy to the cloud. All you need is a reliable way to connect your two networks: either a standard VPN setup, or AWS Direct Connect, which makes it easy to establish a dedicated network connection from your premises to AWS.
Data archive compliance and regulations
Many customers have specific data retention policies and must often comply with regulatory guidelines. For such cases, AWS Glacier offers Vault Lock: a Vault Lock policy allows you to apply compliance controls – such as time-based retention – to the contents of any Glacier vault, and once locked, the policy can no longer be changed.
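To make this concrete, here is a sketch of a Vault Lock policy, adapted from AWS’s documented example, that denies deletion of any archive less than 365 days old. The account ID and vault name are placeholders you would replace with your own:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "deny-delete-for-365-days",
      "Principal": "*",
      "Effect": "Deny",
      "Action": "glacier:DeleteArchive",
      "Resource": "arn:aws:glacier:us-east-1:123456789012:vaults/examplevault",
      "Condition": {
        "NumericLessThan": {
          "glacier:ArchiveAgeInDays": "365"
        }
      }
    }
  ]
}
```

Because the `Deny` applies to all principals, not even an account administrator can delete an archive before the retention period expires – which is exactly the kind of guarantee regulators tend to ask for.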
To review, here are some of the key advantages you can enjoy by archiving your data in the cloud…and with AWS in particular:
- No more need to rely on risky predictions of your data growth and corresponding data storage.
- Reduced overhead of managing huge data stores for long periods.
- Reduced cost.
- Increased availability.
- No more need to identify and invest in particular hardware and skills to implement a reliable, long-term archival design.
Do you have your own cloud/local archiving experience? Let us know in the comments.