Harnessing the Power of Big Data Analysis on AWS

Like a jigsaw puzzle, there are many components in the AWS big data ecosystem. Read this article and see how the components fit together to form a beautiful whole.

If you are a data engineer, wouldn’t it be great if you could easily scale your existing infrastructure on-demand to support your real-time data pipelines?

If you are a data scientist, wouldn’t it be great if you could leverage massive clusters of computers to conduct analyses on large datasets and build statistical models quickly?

Or, maybe you are playing both roles. Perhaps you are a student and you are trying to find a platform to get started in big data quickly. Whatever your current role might be, be sure to read on to learn more about AWS’s big data capabilities provided by the following services: EMR, DynamoDB, Redshift, Data Pipeline, Kinesis Streams, Machine Learning, Elasticsearch, and Kinesis Firehose. That’s a lot of ground to cover, so let’s get started!

Need a big data platform? Look no further than Amazon Web Services!

AWS is one of the largest and fastest-growing cloud infrastructure providers in the world. Over the past ten years, AWS has developed more than 70 products and services to support different and often very demanding types of cloud computing use cases.

The company dominates the global cloud industry with a market share of more than 30%. Even though Microsoft Azure and Google Cloud Platform are growing rapidly, AWS still posted significant growth of 63% year over year. The brand equity that AWS has built over the years has allowed the company to gain a strong foothold in the increasingly competitive cloud computing landscape.
AWS has been embracing the big data movement as part of its Infrastructure-as-a-Service platform strategy since as early as 2008, just two years after it was officially launched as a subsidiary of Amazon.com.

In 2008, AWS announced public access to huge data sets such as the map of the human genome and the US census

It all began with an announcement that certain large public datasets, such as the mapping of the Human Genome and the US Census data, would be made available on Amazon S3 so that anyone who wanted to analyze them could do so quickly and easily by accessing them directly from their Amazon EC2 instances.
Just to get an idea of how large some of these datasets are, the 1000 Genomes Project, which aims to build the most comprehensive catalog of human genetic variation, is over 200 TB. The 1980, 1990, and 2000 US Census datasets are approximately 5 GB, 50 GB, and 200 GB respectively. AWS is able to leverage its cloud computing infrastructure to host these datasets with the hope of spurring innovation.
Shortly after introducing these large public datasets, AWS started supporting big data processing with the Amazon Elastic MapReduce (EMR) service in April 2009. Amazon EC2 and S3 are the basic building blocks that make Amazon EMR possible. AWS has made incremental improvements and added new features to EMR, and has introduced other big data-related services over the years.

Let’s take a look at the constituent services of the AWS big data infrastructure-as-a-service.

Amazon EMR: Elastic MapReduce Hadoop Service

Amazon EMR is a service that lets you create and manage large-scale distributed data processing clusters easily. Data engineers know that setting up different components in the Hadoop ecosystem and getting them to work well together often requires a lot of effort. Amazon EMR takes care of this burden and lets you focus on the dataset instead.

Amazon EMR supports Hadoop and Spark, along with interactive notebooks like Hue and Zeppelin-Sandbox, as well as machine learning frameworks like Mahout and Spark MLlib. For the full list of supported applications, see the Amazon EMR release page.
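
To make this concrete, here is a minimal sketch of launching an EMR cluster with the AWS SDK for Python (boto3). The cluster name, instance types, release label, and the default IAM roles are placeholder values for illustration, so adjust them to match your own account.

```python
import boto3

# All names, instance types, and roles below are placeholder values.
emr = boto3.client("emr", region_name="us-east-1")

response = emr.run_job_flow(
    Name="example-spark-cluster",
    ReleaseLabel="emr-5.30.0",                 # pick any release that bundles the apps you need
    Applications=[{"Name": "Spark"}, {"Name": "Zeppelin"}],
    Instances={
        "InstanceGroups": [
            {"Name": "Master", "InstanceRole": "MASTER",
             "InstanceType": "m5.xlarge", "InstanceCount": 1},
            {"Name": "Core", "InstanceRole": "CORE",
             "InstanceType": "m5.xlarge", "InstanceCount": 2},
        ],
        "KeepJobFlowAliveWhenNoSteps": True,   # keep the cluster up for interactive use
    },
    JobFlowRole="EMR_EC2_DefaultRole",         # default EC2 instance profile
    ServiceRole="EMR_DefaultRole",             # default EMR service role
)
print("Cluster ID:", response["JobFlowId"])
```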

I have written a two-part series about using Apache Spark and Apache Zeppelin on Amazon EMR (see Part 1 and Part 2). You may find my notes on IAM helpful too.

Cloud Academy offers quizzes around Amazon EMR and you can try them out (and all the Cloud Academy resources) for free.

Amazon DynamoDB: Managed NoSQL Databases

Amazon DynamoDB is a fully managed NoSQL database service that you can use to import, persist, and extract schema-less data. You can create DynamoDB tables and populate them with data from CSV files. You can also export data from DynamoDB tables to Amazon S3 buckets as a backup.
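
As a quick illustration, the following boto3 sketch creates a small DynamoDB table and writes a schema-less item to it. The table name, key, and item attributes are made up for this example.

```python
import boto3

dynamodb = boto3.resource("dynamodb", region_name="us-east-1")

# Create a table keyed on a single partition key.
table = dynamodb.create_table(
    TableName="Movies",
    KeySchema=[{"AttributeName": "title", "KeyType": "HASH"}],
    AttributeDefinitions=[{"AttributeName": "title", "AttributeType": "S"}],
    ProvisionedThroughput={"ReadCapacityUnits": 5, "WriteCapacityUnits": 5},
)
table.wait_until_exists()

# Items are schema-less: only the key attributes are required, the rest is free-form.
table.put_item(Item={"title": "The Big Sleep", "year": 1946, "genre": "noir"})
print(table.get_item(Key={"title": "The Big Sleep"})["Item"])
```
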
Our Database Fundamentals for AWS course gives you an introduction to Amazon DynamoDB. We also have a dedicated course focusing on Working with DynamoDB. At the same time, you should also try our hands-on lab to create and query DynamoDB tables.

Amazon Redshift: Industrial-Grade Data Warehousing

Amazon Redshift is a fully managed petabyte-scale data warehouse service. One petabyte is approximately 1,000 terabytes (in case you were going to look it up). You can create and provision an Amazon Redshift cluster to store large amounts of data, and perform fast queries using SQL query tools like SQL Workbench/J or Re:dash. You can also connect your business intelligence application or any other application, as long as it supports the standard PostgreSQL JDBC or ODBC drivers.
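
Because Redshift speaks the PostgreSQL wire protocol, one simple way to query a cluster from Python is with psycopg2. The cluster endpoint, database name, credentials, and the page_views table in this sketch are placeholders.

```python
import psycopg2

# Cluster endpoint, credentials, and the page_views table are placeholders.
conn = psycopg2.connect(
    host="my-cluster.abc123xyz789.us-east-1.redshift.amazonaws.com",
    port=5439,                     # Redshift's default port
    dbname="analytics",
    user="awsuser",
    password="your-password-here",
)

with conn.cursor() as cur:
    cur.execute(
        "SELECT event_date, COUNT(*) AS views "
        "FROM page_views GROUP BY event_date ORDER BY event_date LIMIT 10;"
    )
    for row in cur.fetchall():
        print(row)

conn.close()
```
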
We have an introductory blog post and, as you might imagine, some detailed learning resources if you would like to find out more.

AWS Data Pipeline: Data Processing Workflow Automation

AWS Data Pipeline lets you define a workflow to automate the processing and movement of data from one AWS service to another on a regular basis. You can create a pipeline to launch EMR jobs that run Hive queries on data imported from Amazon RDS, back up DynamoDB tables to Amazon S3 at the end of each business day, or import new data into Amazon Redshift as it becomes available and send an Amazon SNS notification that the new data has been loaded.
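
For a rough idea of how a pipeline is defined programmatically, here is a skeletal boto3 sketch that creates a pipeline with a daily schedule and activates it. The names and dates are illustrative only, and a real definition would also need activities, data nodes, and IAM roles before it could do useful work.

```python
import boto3

dp = boto3.client("datapipeline", region_name="us-east-1")

# Pipeline name, dates, and object names are illustrative only.
pipeline_id = dp.create_pipeline(
    name="nightly-dynamodb-backup", uniqueId="nightly-dynamodb-backup-001"
)["pipelineId"]

dp.put_pipeline_definition(
    pipelineId=pipeline_id,
    pipelineObjects=[
        # A schedule that fires once a day.
        {"id": "DailySchedule", "name": "Every day", "fields": [
            {"key": "type", "stringValue": "Schedule"},
            {"key": "period", "stringValue": "1 day"},
            {"key": "startDateTime", "stringValue": "2016-06-01T00:00:00"},
        ]},
        # The Default object ties the schedule to everything else in the pipeline.
        {"id": "Default", "name": "Default", "fields": [
            {"key": "scheduleType", "stringValue": "cron"},
            {"key": "schedule", "refValue": "DailySchedule"},
            {"key": "failureAndRerunMode", "stringValue": "CASCADE"},
        ]},
    ],
)
dp.activate_pipeline(pipelineId=pipeline_id)
```
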
Our Automated Data Management with EBS, S3, and Glacier course shows you the best methods of backing up your data resources using AWS Data Pipeline.

Amazon Kinesis Streams: Streaming Data Service

Amazon Kinesis Streams is a managed service for ingesting streaming data from many different sources. It makes the data available to Amazon Kinesis Applications that read and process the streaming data in real time. These applications are data consumers developed with the Amazon Kinesis API or the Amazon Kinesis Client Library (KCL). There is also a pre-built library that you can use to integrate Amazon Kinesis Streams with Apache Storm. You can think of this as AWS's equivalent of Apache Kafka.
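
Here is a minimal boto3 sketch of both sides of a stream: a producer putting a record and a simplified consumer reading it back. The stream name, record contents, and shard ID are assumptions for the example; production consumers would normally use the KCL rather than polling shards by hand.

```python
import json
import boto3

kinesis = boto3.client("kinesis", region_name="us-east-1")

# Producer: put a single record onto a (hypothetical) stream named "clickstream".
kinesis.put_record(
    StreamName="clickstream",
    Data=json.dumps({"user": "u-42", "page": "/pricing"}).encode("utf-8"),
    PartitionKey="u-42",   # records with the same key go to the same shard
)

# Consumer (simplified): read from the first shard, starting at the oldest record.
shard_iterator = kinesis.get_shard_iterator(
    StreamName="clickstream",
    ShardId="shardId-000000000000",
    ShardIteratorType="TRIM_HORIZON",
)["ShardIterator"]

for record in kinesis.get_records(ShardIterator=shard_iterator, Limit=10)["Records"]:
    print(record["Data"])
```
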
Our blog post from 2015 titled Amazon Kinesis: Managed Real-time Event Processing is a good starting point and I encourage you to review posts from different sources.

Amazon Machine Learning

Amazon Machine Learning is a service that makes it easy for anyone to create machine learning models for tasks like classification or prediction through the use of wizards. You do not need to have a strong grasp of machine learning to use this service, although you are strongly advised to understand what you are doing.
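
Once a model has been built (for example, through the console wizard) and a real-time endpoint has been created for it, requesting a prediction from code is a single boto3 call. The model ID and feature names below are hypothetical.

```python
import boto3

ml = boto3.client("machinelearning", region_name="us-east-1")

# The model ID and feature names are hypothetical; the model and its real-time
# endpoint would have been created beforehand (for example, via the console wizard).
model_id = "ml-ExampleModelId"
endpoint = ml.get_ml_model(MLModelId=model_id)["EndpointInfo"]["EndpointUrl"]

result = ml.predict(
    MLModelId=model_id,
    Record={"age": "34", "activity": "walking"},   # all feature values are strings
    PredictEndpoint=endpoint,
)
print(result["Prediction"])
```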

Try Amazon Machine Learning with our hands-on lab using the HAR (Human Activity Recognition) dataset. We also have a course that introduces you to the principles and practice of Amazon Machine Learning.

Amazon Elasticsearch: Distributed Search Engine

Elasticsearch is an open source distributed search and analytics engine that you can use to run full-text queries on large amounts of data and make sense of it. Amazon Elasticsearch is a managed service that lets you launch Elasticsearch clusters on AWS.
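
A full-text query against an Elasticsearch index is just an HTTP request. This sketch assumes a domain endpoint, an index called articles, and a body field, all of which are placeholders, and it assumes the domain's access policy allows the request.

```python
import requests

# The domain endpoint, index name, and field names are placeholders.
endpoint = "https://search-my-domain.us-east-1.es.amazonaws.com"

# Standard Elasticsearch match query: find documents whose "body" mentions "kinesis".
query = {"query": {"match": {"body": "kinesis"}}, "size": 5}
resp = requests.get(f"{endpoint}/articles/_search", json=query)

for hit in resp.json()["hits"]["hits"]:
    print(hit["_score"], hit["_source"].get("title"))
```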

We have a couple of blog posts about our impression of Amazon Elasticsearch when it was first launched, and our comparison with Amazon CloudSearch.

Amazon Kinesis Firehose

Amazon Kinesis Firehose is a managed service that lets you create delivery streams to send streaming data to AWS services such as Amazon S3, Amazon Redshift, or Amazon Elasticsearch. Amazon Kinesis Firehose simply reads and writes data; it does not perform any processing on the data stream.
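
Writing to a delivery stream is a single API call; Firehose takes care of buffering and delivering the data to the configured destination. The delivery stream name and record contents below are made up for this example.

```python
import json
import boto3

firehose = boto3.client("firehose", region_name="us-east-1")

# "web-logs-to-s3" is a hypothetical delivery stream already configured with an
# S3 destination; Firehose buffers the records and writes them to the bucket.
firehose.put_record(
    DeliveryStreamName="web-logs-to-s3",
    Record={"Data": (json.dumps({"path": "/home", "status": 200}) + "\n").encode("utf-8")},
)
```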

Conclusion

As you probably gathered from reading this post, big data isn't confined to a single area. I have included links to useful posts and other articles throughout. Researching this article forced me to see AWS big data with fresh eyes: AWS never stands still, and that dynamic progress inspires me.

I hope that this blog post inspires you to explore and learn more about the different big data options that are available on AWS. Check out our Analytics Fundamentals for AWS course. The course covers numerous analytics tools including Amazon EMR, Kinesis Streams and Firehose, Machine Learning, Data Pipeline and Elasticsearch.

Have any questions? Leave a comment below!

 


Written by

Eugene Teo

Eugene Teo is a director of security at a US-based technology company. He is interested in applying machine learning techniques to solve problems in the security domain.

