Big Data on AWS: How the Cloud Can Help You

(Update) We’ve recently added new training material on Big Data services from Amazon Web Services, Microsoft Azure, and Google Cloud Platform to the Cloud Academy Training Library. On top of that, we’ve been publishing new content on the Cloud Academy blog on how to best train yourself and your team on Big Data.


Big Data is a term used to describe data that is too large and complex to be processed by traditional data processing techniques; instead, it requires massively parallel software running on a large number of servers, often in the range of hundreds or even thousands. The size at which data counts as “Big” is relative: what is considered Big Data today might not be considered “Big” a few years from now. One gigabyte of data was considered Big Data years ago; one terabyte (more than a thousand times bigger) is not considered “Big” nowadays.

According to Gartner’s widely used definition, Big Data is mainly characterized by the three V’s: Volume (amount of data), Velocity (speed of data in and out), and Variety (range of data types and sources). In 2012, Gartner updated its definition of Big Data as follows: “Big data is high volume, high velocity, and/or high variety information assets that require new forms of processing to enable enhanced decision making, insight discovery, and process optimization.”

Big Data on AWS

Big Data processing traditionally requires huge investments in hardware and processing resources, which creates an obstacle for small and medium-sized businesses. Cloud computing on public clouds removes this obstacle by providing pay-as-you-go, on-demand, and scalable services for handling Big Data. Using the cloud for Big Data reduces the cost of hardware, reduces the cost of processing, and makes it easier to test the value of Big Data before committing expensive resources.

Amazon Web Services is the largest public cloud and is described by Gartner as being years ahead of other public clouds. It provides a comprehensive set of services that enables customers to rely entirely on AWS to handle their Big Data. In addition to database services, AWS makes it easy to provision computation (EC2), storage (S3), data transfer (AWS Direct Connect and Import/Export), and archiving (Glacier) services that help turn data into information useful for business. In the rest of this article, we will shed light on the AWS data services used to handle Big Data.
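For a concrete (if simplified) picture of that workflow, here is a minimal, hypothetical boto3 sketch that lands a raw log file in S3, the bulk store most of the services discussed below can read from. The bucket and file names are invented for illustration.

```python
# Hypothetical sketch: move raw data into S3 for later processing.
# Bucket and file names are placeholders, not values from the article.
import boto3

s3 = boto3.client("s3")

# Create a bucket to hold the raw input data (bucket names are globally unique).
s3.create_bucket(Bucket="my-bigdata-input-bucket")

# Upload a local log file; S3 then acts as the durable, scalable store
# that services such as EMR and Redshift read from.
s3.upload_file(
    "clickstream-2019-01-01.log",
    "my-bigdata-input-bucket",
    "raw/clickstream-2019-01-01.log",
)
```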

Amazon EMR

Amazon EMR is essentially Hadoop, enhanced by Amazon to work seamlessly on AWS. Hadoop is an open-source software framework for distributed storage and distributed processing of Big Data on clusters of commodity hardware (in EMR’s case, AWS virtual servers). The Hadoop Distributed File System (HDFS) splits files into large blocks and distributes the blocks among the nodes in the cluster. Hadoop MapReduce processes the data by moving code to the nodes that hold the required data, so the data is processed in parallel across the nodes.

Hadoop clusters running on Amazon EMR use Amazon S3 for bulk storage of input and output data and CloudWatch to monitor cluster performance and raise alarms. You can also move data into and out of DynamoDB using Amazon EMR and Hive. All of this is orchestrated by the Amazon EMR control software that launches and manages the Hadoop cluster; together these resources are called an Amazon EMR cluster. EMR brings the advantages of the cloud to traditional Hadoop: users can provision scalable clusters of virtual servers within minutes and pay only for actual use.
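As a rough illustration of how little setup that takes, here is a minimal, hypothetical boto3 sketch of launching an EMR cluster. The release label, instance types, roles, and log bucket are placeholder values, not recommendations from the article.

```python
# Hypothetical sketch: launch a small, transient EMR (Hadoop) cluster.
import boto3

emr = boto3.client("emr")

response = emr.run_job_flow(
    Name="bigdata-demo-cluster",
    ReleaseLabel="emr-5.29.0",                 # EMR release bundling Hadoop
    Applications=[{"Name": "Hadoop"}, {"Name": "Hive"}],
    LogUri="s3://my-emr-logs-bucket/logs/",    # cluster logs land in S3
    Instances={
        "MasterInstanceType": "m5.xlarge",
        "SlaveInstanceType": "m5.xlarge",
        "InstanceCount": 3,                    # 1 master + 2 core nodes
        "KeepJobFlowAliveWhenNoSteps": False,  # shut down when the work is done
    },
    JobFlowRole="EMR_EC2_DefaultRole",         # default EMR IAM roles
    ServiceRole="EMR_DefaultRole",
)
print("Cluster started:", response["JobFlowId"])
```

Because the cluster is billed by usage and torn down when its steps finish, the cost model follows the pay-for-what-you-use pattern described above.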

EMR also integrates with and benefits from other AWS services. Open-source projects that run on top of the Hadoop architecture can also be run on Amazon EMR.

Amazon Redshift

Amazon Redshift is Amazon’s columnar data store: data is arranged in columns instead of rows, enabling faster access for analytic applications. It is a fully managed, petabyte-scale data warehouse service. Redshift is designed for analytic workloads and connects to standard SQL-based clients and business intelligence tools.

According to Amazon’s website, Redshift delivers fast query and I/O performance for virtually any size dataset by using columnar storage technology and parallelizing and distributing queries across multiple nodes. Most common administrative tasks associated with provisioning, configuring, monitoring, backing up, and securing a data warehouse are automated.
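Because Redshift speaks the standard PostgreSQL wire protocol, a standard SQL client can query it directly. The sketch below uses the psycopg2 Python driver with an invented cluster endpoint, credentials, and table, purely for illustration.

```python
# Hypothetical sketch: run an analytic query against a Redshift cluster
# using a standard PostgreSQL client. Endpoint, credentials, and table
# names are placeholders.
import psycopg2

conn = psycopg2.connect(
    host="my-cluster.abc123xyz.us-east-1.redshift.amazonaws.com",
    port=5439,                    # Redshift's default port
    dbname="analytics",
    user="admin",
    password="example-password",
)

with conn.cursor() as cur:
    # A typical analytic query: aggregate page views per day.
    # Columnar storage means only the referenced columns are scanned.
    cur.execute("""
        SELECT event_date, COUNT(*) AS page_views
        FROM clickstream_events
        GROUP BY event_date
        ORDER BY event_date;
    """)
    for event_date, page_views in cur.fetchall():
        print(event_date, page_views)

conn.close()
```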

DynamoDB

Amazon DynamoDB is a fast, flexible, fully managed NoSQL database service for applications that need consistent, single-digit-millisecond latency at any scale. It offers high availability and reliability with seamless scaling. DynamoDB capacity is purchased based on throughput rather than storage: when more throughput is requested, DynamoDB spreads the data and traffic over a number of servers backed by solid-state drives to deliver predictable performance.

DynamoDB supports both document and key-value data models and is schema-less: each item (row) has a primary key and any number of attributes (columns), and the primary key is the only attribute required to identify an item. In addition to querying by primary key, DynamoDB adds flexibility by allowing queries on non-primary-key attributes through Global Secondary Indexes and Local Secondary Indexes. Its flexible data model and reliable performance make it a great fit for mobile, web, gaming, ad-tech, IoT, and many other applications.
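To make the throughput and indexing model concrete, here is a hypothetical boto3 sketch. The table, attributes, and capacity numbers are invented for illustration, not taken from the article.

```python
# Hypothetical sketch: create a DynamoDB table with provisioned throughput
# and a Global Secondary Index, then write and query an item.
import boto3
from boto3.dynamodb.conditions import Key

dynamodb = boto3.resource("dynamodb")

# Throughput (read/write capacity units) is what you provision and pay for,
# not storage; DynamoDB spreads data and traffic across servers for you.
table = dynamodb.create_table(
    TableName="GameScores",
    KeySchema=[
        {"AttributeName": "PlayerId", "KeyType": "HASH"},    # partition key
        {"AttributeName": "GameTitle", "KeyType": "RANGE"},  # sort key
    ],
    AttributeDefinitions=[
        {"AttributeName": "PlayerId", "AttributeType": "S"},
        {"AttributeName": "GameTitle", "AttributeType": "S"},
        {"AttributeName": "TopScore", "AttributeType": "N"},
    ],
    # A Global Secondary Index allows queries on a non-primary-key attribute.
    GlobalSecondaryIndexes=[{
        "IndexName": "GameTitleIndex",
        "KeySchema": [
            {"AttributeName": "GameTitle", "KeyType": "HASH"},
            {"AttributeName": "TopScore", "KeyType": "RANGE"},
        ],
        "Projection": {"ProjectionType": "ALL"},
        "ProvisionedThroughput": {"ReadCapacityUnits": 5, "WriteCapacityUnits": 5},
    }],
    ProvisionedThroughput={"ReadCapacityUnits": 10, "WriteCapacityUnits": 10},
)
table.wait_until_exists()

# Items are schema-less: only the key attributes are required.
table.put_item(Item={"PlayerId": "p-42", "GameTitle": "Galaxy Run", "TopScore": 5842})

# Query by primary key...
by_player = table.query(KeyConditionExpression=Key("PlayerId").eq("p-42"))

# ...or by the secondary index.
by_game = table.query(
    IndexName="GameTitleIndex",
    KeyConditionExpression=Key("GameTitle").eq("Galaxy Run"),
)
print(by_player["Items"], by_game["Items"])
```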

Big Data is not necessarily NoSQL: relational databases can be Big too

Although the term Big Data is mainly associated with NoSQL databases, relational databases can fall under the definition of Big Data too. According to Amazon’s website, Amazon RDS makes it easy to set up, operate, and scale a relational database in the cloud. It provides cost-efficient and resizable capacity while managing time-consuming database administration tasks, freeing you to focus on your applications and business.
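As a small, hypothetical boto3 sketch of that setup step, the snippet below provisions a managed MySQL instance. The identifier, instance class, credentials, and settings are placeholders chosen for illustration only.

```python
# Hypothetical sketch: provision a managed relational database with RDS.
import boto3

rds = boto3.client("rds")

# AWS handles patching, backups, and failover for the instance.
rds.create_db_instance(
    DBInstanceIdentifier="bigdata-metadata-db",
    DBInstanceClass="db.t3.medium",
    Engine="mysql",
    MasterUsername="admin",
    MasterUserPassword="example-password",
    AllocatedStorage=100,          # GiB of storage
    MultiAZ=True,                  # standby replica in a second Availability Zone
    BackupRetentionPeriod=7,       # days of automated backups
)
```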

Amazon Kinesis

AWS has introduced a real-time stream processing service called Amazon Kinesis. Amazon describes Kinesis as a fully managed streaming data service: various types of data, such as clickstreams, application logs, and social media feeds, can be continuously put into an Amazon Kinesis stream from hundreds of thousands of sources. Within seconds, the data is available for an Amazon Kinesis application to read and process from the stream. A Kinesis stream consists of shards that receive data from producer applications. A shard is the basic unit of a Kinesis stream and supports up to 1 MB of data written per second and 2 MB of data read per second. Consumer applications take the data from the Kinesis stream and perform whatever processing is required.
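The hypothetical boto3 sketch below shows both sides of that flow: a producer putting a record into a stream and a consumer reading it back. It assumes a stream named "clickstream" already exists with at least one shard, and the record contents are invented.

```python
# Hypothetical sketch: write to and read from a Kinesis stream.
import json
import boto3

kinesis = boto3.client("kinesis")

# Producer side: write one clickstream event into the stream.
# The partition key determines which shard receives the record.
kinesis.put_record(
    StreamName="clickstream",
    Data=json.dumps({"user": "u-123", "page": "/pricing"}).encode("utf-8"),
    PartitionKey="u-123",
)

# Consumer side: read records back from the first shard.
shard_id = kinesis.describe_stream(
    StreamName="clickstream"
)["StreamDescription"]["Shards"][0]["ShardId"]

iterator = kinesis.get_shard_iterator(
    StreamName="clickstream",
    ShardId=shard_id,
    ShardIteratorType="TRIM_HORIZON",   # start from the oldest available record
)["ShardIterator"]

for record in kinesis.get_records(ShardIterator=iterator)["Records"]:
    print(json.loads(record["Data"]))
```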

Looking at the services Amazon provides to handle Big Data, AWS has a complete set that covers all needs for Big Data processing, storage, and transfer. AWS covers the full spectrum of Big Data technologies: Hadoop and MapReduce (EMR), relational databases (RDS), NoSQL databases (DynamoDB), columnar data stores (Redshift), and stream processing (Kinesis). In addition, Amazon makes it easy to connect these services with each other and with other AWS services, which creates unrivaled flexibility and capability for Big Data.


Written by

Motasem Aldiab

Motasem Aldiab is a professor, consultant, trainer, and developer. Dr. Aldiab received his PhD in Computer Engineering from QUB in 2008. He is a certified trainer for the Cloud School and SOA School. He has been training and offering consultations for years in Java, SOA, and Cloud Computing, and leading workshops and training sessions (virtual or instructor-led).

