(Update) We’ve recently uploaded new training material on Big Data using services from Amazon Web Services, Microsoft Azure, and Google Cloud Platform to the Cloud Academy Training Library. On top of that, we’ve been busy adding new content to the Cloud Academy blog on how best to train yourself and your team on Big Data.
Big Data is a term used to describe data that is too large and complex to be processed by traditional data processing techniques. Instead, it requires massively parallel software running on a large number of servers, often hundreds or even thousands. The size at which data becomes “Big” is relative: what counts as Big Data today might not a few years from now. 1 GB of data was considered Big Data years ago; 1 TB (more than a thousand times bigger) is not considered “Big” nowadays.
According to Gartner’s widely used definition, Big Data is mainly characterized by the 3 V’s: Volume (amount of data), Velocity (speed of data in and out), and Variety (range of data types and sources). In 2012, Gartner updated its definition of Big Data as follows: “Big data is high volume, high velocity, and/or high variety information assets that require new forms of processing to enable enhanced decision making, insight discovery, and process optimization.”
Big Data on AWS
Big Data processing requires huge investments in hardware and processing resources, which creates an obstacle for small and medium-sized businesses. Cloud computing on public clouds can overcome this obstacle by providing pay-as-you-go, on-demand, and scalable cloud services for handling Big Data. Using cloud computing for Big Data reduces the cost of hardware and processing, and makes it possible to test the value of Big Data before committing expensive resources.
Amazon Web Services is the largest public cloud and is described by Gartner as being years ahead of other public clouds. It provides a comprehensive set of services that enable customers to rely completely on AWS to handle their Big Data. In addition to database services, AWS makes it easy to provision computation (EC2), storage (S3), data transfer (AWS Direct Connect and Import/Export services), and archiving (Glacier) services to help turn data into information useful for business. In the rest of this article, we will shed light on the AWS data services used to handle Big Data.
Amazon Elastic MapReduce (EMR) is basically Hadoop enhanced by Amazon to work seamlessly on AWS. Hadoop is an open-source software framework for the distributed storage and distributed processing of Big Data on clusters of commodity hardware (in EMR, AWS virtual servers). The Hadoop Distributed File System (HDFS) splits files into large blocks and distributes the blocks among the nodes in the cluster. Hadoop MapReduce processes the data by moving code to the nodes that hold the required data, so the data is processed in parallel across the nodes.
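The MapReduce model described above can be sketched in a few lines of Python. This is a simplified, single-process illustration of the idea, not the Hadoop API: real Hadoop distributes the map, shuffle, and reduce phases across cluster nodes, and the function names here are ours.

```python
from collections import defaultdict

def map_phase(document):
    # Emit a (word, 1) pair for every word, as a Hadoop mapper would.
    return [(word.lower(), 1) for word in document.split()]

def shuffle(pairs):
    # Group values by key; in Hadoop this happens between map and reduce.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Sum the counts for each word, as a Hadoop reducer would.
    return {word: sum(counts) for word, counts in grouped.items()}

documents = ["big data is big", "data needs parallel processing"]
pairs = [pair for doc in documents for pair in map_phase(doc)]
counts = reduce_phase(shuffle(pairs))
print(counts["big"], counts["data"])  # 2 2
```

Because each mapper only needs its own document and each reducer only needs its own key group, the phases can run on different machines, which is exactly what EMR orchestrates for you.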
Hadoop clusters running on Amazon EMR use Amazon S3 for bulk storage of input and output data, and CloudWatch to monitor cluster performance and raise alarms. You can also move data into and out of DynamoDB using Amazon EMR and Hive. All of this is orchestrated by Amazon EMR control software that launches and manages the Hadoop cluster; the result is called an Amazon EMR cluster. EMR has the advantage of the cloud over traditional Hadoop deployments: users can provision scalable clusters of virtual servers within minutes and pay only for actual use.
EMR can also integrate with and benefit from other AWS services. Open-source projects that run on top of the Hadoop architecture can also run on Amazon EMR.
Amazon Redshift is Amazon’s columnar data store: data is arranged in columns instead of rows, enabling faster access for analytic applications. It is a fully managed, petabyte-scale data warehouse service. Redshift is designed for analytic workloads and connects to standard SQL-based clients and business intelligence tools.
According to Amazon’s website, Redshift delivers fast query and I/O performance for virtually any size of dataset by using columnar storage technology and by parallelizing and distributing queries across multiple nodes. The most common administrative tasks associated with provisioning, configuring, monitoring, backing up, and securing a data warehouse are automated.
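To see why columnar storage helps analytic workloads, consider this toy sketch (ours, not Redshift’s internals): a column store keeps each column contiguous, so an aggregate over one column scans only that column’s values instead of touching every full row.

```python
# Row store: one record per row, all columns kept together.
rows = [
    {"id": 1, "region": "us", "sales": 100},
    {"id": 2, "region": "eu", "sales": 250},
    {"id": 3, "region": "us", "sales": 175},
]

# Column store: each column stored contiguously on its own.
columns = {
    "id": [1, 2, 3],
    "region": ["us", "eu", "us"],
    "sales": [100, 250, 175],
}

# A row store must read every field of every row to aggregate one column...
row_total = sum(r["sales"] for r in rows)
# ...while a column store scans only the single column it needs.
col_total = sum(columns["sales"])
print(row_total, col_total)  # 525 525
```

Both layouts give the same answer; the difference is how much data must be read to get it, which is what makes the columnar layout faster for analytics at petabyte scale.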
Amazon DynamoDB is a fully managed, fast, and flexible NoSQL database service for all applications that need consistent, single-digit-millisecond latency at any scale. It offers high availability and reliability with seamless scaling. In DynamoDB, service is purchased based on throughput rather than storage: when more throughput is requested, DynamoDB spreads the data and traffic over a number of servers backed by solid-state drives to deliver predictable performance.
DynamoDB supports both document and key-value data models and is schema-less: each item (row) has a primary key and any number of attributes (columns), and the primary key is the only attribute required to identify the item. In addition to querying by primary key, DynamoDB adds flexibility by allowing queries on non-primary-key attributes using Global Secondary Indexes and Local Secondary Indexes. Its flexible data model and reliable performance make it a great fit for mobile, web, gaming, ad-tech, IoT, and many other applications.
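A minimal sketch of this schema-less key-value model, using plain Python dicts to stand in for a DynamoDB table (the table, attribute names, and helper function are made up for illustration; this is not the DynamoDB API):

```python
# "Table" keyed by the primary key; each item can carry different attributes.
players = {}

def put_item(table, key, **attributes):
    # Only the primary key is required; all other attributes are optional.
    table[key] = {"player_id": key, **attributes}

put_item(players, "p1", level=12, guild="red")
put_item(players, "p2", level=7)           # no guild attribute: schema-less

# Querying by primary key is a direct lookup.
item = players["p1"]

# A (greatly simplified) secondary index lets us query a non-key attribute.
by_guild = {}
for it in players.values():
    by_guild.setdefault(it.get("guild"), []).append(it["player_id"])

print(item["level"], by_guild["red"])  # 12 ['p1']
```

The real service maintains its secondary indexes automatically as items change; the point here is only that items sharing a table need not share a schema, and non-key lookups require an index.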
Big Data is not necessarily NoSQL: relational DBs can be Big too
Although the term Big Data is mainly associated with NoSQL databases, relational databases can come under the definition of Big Data too. According to Amazon’s website, Amazon RDS allows you to easily set up, operate, and scale a relational database in the cloud. It provides cost-efficient and resizable capacity while managing time-consuming database administration tasks, freeing you to focus on your applications and business.
AWS has introduced a real-time event processing service called Amazon Kinesis. Amazon describes Kinesis as a fully managed streaming data service: continuously generated data of various types, such as clickstreams, application logs, and social media, can be put into an Amazon Kinesis stream from hundreds of thousands of sources. Within seconds, the data is available for an Amazon Kinesis application to read and process from the stream. A Kinesis stream consists of shards that receive data from producer applications. The shard is the basic unit of a Kinesis stream and supports 1 MB of data written per second and 2 MB of data read per second. Consumer applications take data from the stream and perform whatever processing is required.
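Kinesis decides which shard receives each record by hashing the record’s partition key (the real service maps an MD5 hash onto each shard’s hash key range). A simplified sketch of that producer-side routing, with made-up record data and a modulo in place of the hash key ranges:

```python
import hashlib

NUM_SHARDS = 4
shards = {i: [] for i in range(NUM_SHARDS)}

def put_record(partition_key, data):
    # Kinesis hashes the partition key with MD5 and maps the digest onto a
    # shard's hash key range; we approximate the mapping with a modulo.
    digest = int(hashlib.md5(partition_key.encode()).hexdigest(), 16)
    shard_id = digest % NUM_SHARDS
    shards[shard_id].append(data)
    return shard_id

# Records with the same partition key always land on the same shard,
# which preserves per-key ordering for the consumer application.
first = put_record("user-42", b"click:home")
second = put_record("user-42", b"click:checkout")
print(first == second)  # True
```

This is also why the partition key matters for throughput: since each shard caps out at 1 MB/s of writes, keys should be chosen so records spread evenly across shards.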
Looking at the services Amazon provides to handle Big Data, AWS has a complete set that covers all needs for Big Data processing, storage, and transfer. AWS covers the full spectrum of Big Data technologies: Hadoop and MapReduce (EMR), relational databases (RDS), NoSQL databases (DynamoDB), columnar data stores (Redshift), and stream processing (Kinesis). In addition, Amazon has made it easy to connect these services with each other and with other services on AWS, and that creates unrivaled flexibility and capability for Big Data.