This learning path prepares you for the 3-hour AWS Big Data Specialty certification exam. It provides an in-depth understanding of the AWS big data services available and how to use those services together to create big data solutions.
We cover the six domains of the big data specialty exam outline with courses, labs, and quizzes.
For domain one, we explain the various data collection methods and techniques for determining the operational characteristics of a collection system. We explore how to define a collection system able to handle the frequency of data change and the type of data being ingested. We identify how to enforce data properties such as order, data structure, and metadata, and how to ensure the durability and availability of our collection approach.
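As a concrete illustration of enforcing ordering in a collection system, the sketch below builds the arguments for a Kinesis `put_record` call. The stream name and payload are hypothetical examples, not from the course; with boto3 installed and credentials configured, the resulting dict could be sent via `boto3.client("kinesis").put_record(**args)`.

```python
import json

def build_kinesis_record(stream_name, payload, partition_key):
    """Build arguments for a Kinesis put_record call.

    Records that share a partition key are routed to the same shard,
    which is how Kinesis preserves their relative order.
    """
    return {
        "StreamName": stream_name,
        "Data": json.dumps(payload).encode("utf-8"),
        "PartitionKey": partition_key,
    }

# Keying by user ID keeps each user's events in order on a single shard.
args = build_kinesis_record("clickstream", {"user": "u42", "event": "login"}, "u42")
```

Choosing the partition key is the design decision here: a high-cardinality key (such as a user ID) spreads load across shards while still guaranteeing per-key ordering.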
Domain two of the Big Data Specialty learning path focuses on storage. In this group of courses, we outline the key storage options for big data solutions. We determine data access and retrieval patterns, and the use cases that suit particular data patterns, such as evaluating mechanisms for the capture, update, and retrieval of catalog entries. We learn how to determine appropriate data structures and storage formats, and how to determine and optimize the operational characteristics of a big data storage solution.
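One storage-layout technique that directly shapes retrieval patterns is Hive-style partitioning of S3 keys, which lets query engines such as Athena or EMR prune entire prefixes when a query filters on date. The prefix and filename below are illustrative assumptions, not examples from the course:

```python
from datetime import date

def partitioned_key(prefix, day, filename):
    """Build a Hive-style, date-partitioned S3 object key.

    Engines that understand year=/month=/day= partitions can skip
    whole prefixes instead of scanning every object.
    """
    return (f"{prefix}/year={day.year}/month={day.month:02d}/"
            f"day={day.day:02d}/{filename}")

key = partitioned_key("logs", date(2018, 3, 9), "events-0001.parquet")
# → "logs/year=2018/month=03/day=09/events-0001.parquet"
```

Pairing a layout like this with a columnar format such as Parquet reduces both the number of objects scanned and the bytes read per object.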
In domain three of the Big Data Specialty learning path, we learn how to identify the appropriate data processing technologies needed for big data scenarios. We explore how to design and architect a data processing solution, and explore and define the operational characteristics of big data processing. We delve into the various processing services available, focusing on Amazon Kinesis, Elastic MapReduce, and Amazon Rekognition.
For domain four of the Big Data Specialty learning path, we learn how to determine the tools and techniques required for data analysis. We explore how to design and architect an analytical solution, and how to optimize the operational characteristics of the analysis system using tools such as Amazon Athena and Kinesis.
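To make the Athena workflow concrete, the sketch below assembles the arguments for Athena's `start_query_execution` API. The database, table, and results bucket are hypothetical placeholders; with boto3, the dict could be passed to `boto3.client("athena").start_query_execution(**params)`.

```python
def athena_query_params(sql, database, output_location):
    """Build arguments for Athena's start_query_execution API.

    Athena runs standard SQL directly against data in S3 and writes
    query results to the given S3 output location.
    """
    return {
        "QueryString": sql,
        "QueryExecutionContext": {"Database": database},
        "ResultConfiguration": {"OutputLocation": output_location},
    }

params = athena_query_params(
    "SELECT event, COUNT(*) AS hits FROM clicks GROUP BY event",
    "weblogs",
    "s3://my-athena-results/",
)
```

Because Athena is billed by bytes scanned, pairing queries like this with partitioned, columnar data (as discussed in the storage domain) is the main cost and performance lever.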
In domain five, we learn how to determine the appropriate techniques for delivering the results/output of a query or analysis. We examine how to design and create a visualization platform using AWS services, and how to optimize visualization services to present results in an effective and accessible manner using Amazon QuickSight.
Data security comprises 20% of the certification curriculum, so it is important that students have a thorough understanding of security best practices for big data solutions. In this course, we examine how to determine encryption requirements and how to implement encryption services. We examine how to choose the appropriate technology to enforce data governance, and identify how to ensure data integrity while working with big data solutions.
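As one example of implementing encryption at rest, the sketch below builds S3 `put_object` arguments requesting server-side encryption with a customer-managed KMS key (SSE-KMS). The bucket, key, and KMS alias are hypothetical; with boto3, the dict could be passed to `boto3.client("s3").put_object(**args)`.

```python
def encrypted_put_args(bucket, key, body, kms_key_id):
    """Build S3 put_object arguments using SSE-KMS.

    ServerSideEncryption="aws:kms" asks S3 to encrypt the object with
    the given customer-managed KMS key, so key usage is auditable in
    CloudTrail and access is governed by the key policy.
    """
    return {
        "Bucket": bucket,
        "Key": key,
        "Body": body,
        "ServerSideEncryption": "aws:kms",
        "SSEKMSKeyId": kms_key_id,
    }

args = encrypted_put_args("my-data-lake", "raw/2018/events.json",
                          b"{}", "alias/bigdata-key")
```

Using a customer-managed key (rather than the default S3-managed keys) is what enables the governance controls the exam focuses on: key rotation policy, cross-account grants, and audit trails.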
- [Instructor] Hi, Cloud Academy ninjas. Welcome to the wonderful world of big data. Now this learning path prepares you for the AWS Big Data Specialty certification. It's a three-hour, specialty-level certification exam. So for this certification, we need to have an in-depth understanding of the big data options that AWS provides. And most importantly, we need to know when to use each of those services. We cover the six domains of the big data specialty exam outline with courses, labs, and quizzes.
My name is Andrew Larkin. I'm the AWS Lead at Cloud Academy, and we've assembled a rockstar cast to help you get ready for your certification exam. We start with an introduction to analytics and database fundamentals, then cover big data collection, storage, processing, analysis, visualization, and security in detail. So first off, Shane Gibson, a true big data specialist with many years' experience working on real-world big data solutions, takes us through the core domains of collection and storage.
Next, Ryan Park, the Storage Operations Manager at Slack.com and a DynamoDB ninja, takes us on a deep dive into DynamoDB. We look at how to use indexes, how to troubleshoot performance issues, and how to use DynamoDB in a big data scenario. Then, AWS specialist and many-times-certified AWS ninja Jeremy Cook leads us through the processing domain. We go deep into Redshift and Elastic MapReduce. We look at cluster migrations, performance issues, tuning, and data ingestion. We look at supported software like Presto, Hive, Zeppelin, Jupyter, Spark, Spark Streaming, all those great things. Kinesis features a lot in the big data specialty cert, so we cover when to use it, how to use it, and why you would use it. We start with an introduction to Kinesis from Richard, and then Jeremy takes us into the more advanced aspects of Kinesis ingestion, transformation, and data processing.
Jeremy then delves deep into Amazon Athena and how we can use this great tool to query and analyze various data sets. Then James introduces us to Amazon Machine Learning. We need some knowledge of the algorithms that are supported by Amazon Machine Learning, and we get our hands dirty with an interesting way of using Amazon Rekognition. Then we learn all about QuickSight and how to visualize data in an analytical scenario. Then, to close out, Stuart Scott, our AWS ninja and security specialist, takes us through an encryption deep dive to ensure we are prepared for the security aspects of the big data certification. With Cloud Academy, you are in charge. So remember, you can change the speed of the lectures to suit you, you can go faster or slower, or skip around the curriculum if you think it suits you better. Whatever works best. The labs give you hands-on experience, so do make sure you do them, alright? Okay, ninjas, if you are ready, let's get started in the wonderful world of big data services on AWS.
Andrew is fanatical about helping business teams gain the maximum ROI possible from adopting, using, and optimizing Public Cloud Services. Having built 70+ Cloud Academy courses, Andrew has helped over 50,000 students master cloud computing by sharing the skills and experiences he gained during 20+ years leading digital teams in code and consulting. Before joining Cloud Academy, Andrew worked for AWS and for AWS technology partners Ooyala and Adobe.