It is relatively simple to create powerful data pipelines in DC/OS. In this Lab, you will learn how to perform streaming data analytics by building a data pipeline in DC/OS that combines multiple services and a Twitter-like application. You will review many of the fundamental concepts in using DC/OS along the way, including installing packages, using Marathon-LB to load balance traffic, and working with virtual IPs.
Upon completion of this Lab you will be able to:
- Install DC/OS packages with custom options using the DC/OS CLI
- Deploy a data pipeline using Kafka, Cassandra, and a social networking app
- Use the Zeppelin package and DC/OS Spark to perform basic streaming analytics on the data pipeline
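As a preview of the first objective, installing a package with custom options from the DC/OS CLI follows a simple pattern: write the options to a JSON file, then pass it to `dcos package install`. The sketch below is illustrative; the file name and option values are examples, not the Lab's exact configuration.

```shell
# Write custom options to a JSON file, e.g. reducing Kafka's broker count
# (the option value here is an example, not the Lab's required setting):
cat > kafka-options.json <<'EOF'
{
  "brokers": {
    "count": 1
  }
}
EOF

# Install the package with those options, skipping the confirmation prompt:
dcos package install kafka --options=kafka-options.json --yes
```

The same `--options` pattern applies to the other packages in the pipeline, such as Cassandra and Zeppelin.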
You should be familiar with:
- Basic and intermediate DC/OS concepts including Virtual IPs and Marathon-LB
- Working at the command-line in Linux
- AWS services (optional, to understand the architecture of the pre-created DC/OS cluster)
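To refresh the virtual IP concept from the prerequisites: in DC/OS, a service gets a named VIP by attaching a `VIP_0` label to one of its port definitions, and the cluster's layer-4 load balancer then resolves a stable address for it. A minimal sketch, with a hypothetical app ID and port:

```shell
# Sketch: exposing a Marathon app on a named VIP.
# The app ID "/my-app" and port 8080 are illustrative.
cat > my-app.json <<'EOF'
{
  "id": "/my-app",
  "cmd": "python3 -m http.server $PORT0",
  "cpus": 0.1,
  "mem": 64,
  "instances": 1,
  "portDefinitions": [
    {
      "port": 0,
      "labels": { "VIP_0": "/my-app:8080" }
    }
  ]
}
EOF

dcos marathon app add my-app.json
# Other tasks on the cluster can then reach the app at the stable address
# my-app.marathon.l4lb.thisdcos.directory:8080, regardless of where its
# instances are scheduled.
```

The Lab's pipeline relies on this mechanism so that, for example, the social networking app can reach Kafka and Cassandra at fixed addresses.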
Before completing the Lab instructions, the environment will look as follows: (starting environment diagram)
After completing the Lab instructions, the environment should look similar to: (final environment diagram)
October 2nd, 2020 - Replaced CoreOS virtual machines (no longer available in AWS) with CentOS
January 10th, 2019 - Added a validation Lab Step to check the work you perform in the Lab
Logan has been involved in software development and research since 2007 and has been in the cloud since 2012. He is an AWS Certified DevOps Engineer - Professional, AWS Certified Solutions Architect - Professional, Microsoft Certified Azure Solutions Architect Expert, MCSE: Cloud Platform and Infrastructure, Google Cloud Certified Associate Cloud Engineer, Certified Kubernetes Administrator (CKA), Certified Kubernetes Application Developer (CKAD), Linux Foundation Certified System Administrator (LFCS), and Certified OpenStack Administrator (COA). He earned his Ph.D. studying design automation and enjoys all things tech.