This course is designed to help you master the skills of designing and building cloud-native applications.
Observe the end-to-end process of building a sample cloud-native application using React, Go, MongoDB, and Docker. By taking this course, you'll not only see firsthand the skills required to create a robust, enterprise-grade, cloud-native application, but you'll also be able to apply them yourself, as all code and deployment assets are available for you to perform your own deployment.
What you'll learn:
- Understand the basic principles of building cloud-native applications
- Understand the benefits of using React for frontend web development
- Understand the benefits of using Go for backend API development
- Understand the benefits of using MongoDB as a database system
- And finally, you’ll learn how to package and run microservices as lightweight containers using Docker
This training course provides you with several hands-on demonstrations in which you will observe firsthand how to:
- Build a React-based web frontend
- Build a Go-based API
- Deploy and configure a MongoDB database
- Deploy the full end-to-end application to Docker running locally
Prerequisites:
- A basic understanding of web-based software development
- Previous exposure to containers and containerization - in particular, Docker
Intended audience:
- Anyone interested in learning how to architect cloud-native applications
- Anyone interested in using modern development tools such as React, Go, MongoDB, and Docker
- Anyone interested in containerization
- DevOps practitioners
- [Instructor] Okay, welcome back. In this lecture, I'll demonstrate how to set up and create a custom dedicated Docker network, onto which we'll then launch the frontend, API, and MongoDB containers that are required to run our full cloud-native application.
For starters, let's examine the current set of Docker networks available by running the command docker network list. Here we can see the three existing default networks. Let's now create a new network named cloud-native-demo for our demonstration, like so.
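The two network commands just described might look like the following sketch; the hyphenated spelling cloud-native-demo is an assumption, since the transcript only gives the spoken name:

```shell
# List the networks Docker knows about (bridge, host, and none exist by default)
docker network ls

# Create a dedicated user-defined bridge network for the demo;
# "cloud-native-demo" is an assumed spelling of the name used in the lecture
docker network create cloud-native-demo
```

A user-defined bridge network also gives containers automatic DNS resolution by container name, which is what lets the API reach the database by the name mongo later in the lecture.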
If we relist the available networks, we can now see our new cloud-native-demo network, created by the previous command. Let's now inspect its details by running the command docker network inspect cloud-native-demo. Here we can see, within the IPAM section (short for IP Address Management), that our new network has been assigned the subnet range 172.23.0.0/16, together with a gateway of 172.23.0.1.
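The inspect step can be sketched as follows, again assuming the network name cloud-native-demo; the subnet and gateway values will vary from host to host:

```shell
# Dump the full JSON description of the network, including the IPAM block
docker network inspect cloud-native-demo

# Or extract just the subnet and gateway with a Go-template filter
docker network inspect cloud-native-demo \
  --format '{{range .IPAM.Config}}{{.Subnet}} {{.Gateway}}{{end}}'
```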
Okay, moving on, let's clear the terminal and prepare to start launching our Docker containers. We'll run the command docker ps to see whether there are any existing containers running locally, and as we can see, there aren't any, which is good. In terms of launch sequence, I'll launch the MongoDB database container first, followed by the API, and finish with the frontend.
Okay, for the MongoDB database container, I'll jump into my browser and find the official mongo image hosted on Docker Hub, like so. Then, back within the terminal, I'll pull this image down locally by running the copied command. Okay, that looks good; we have the image locally. Let's now fire up the MongoDB container, give it the name mongo, and expose it on the default MongoDB port, 27017. Now, that was quick; this is one of the key advantages of working with microservice-enabling technologies such as Docker. Next in line is the API container. We'll kick this off by giving it the name api and exposing it on port 8080. Notice the passed-in Mongo connection string variable; this is set to override the default connection string. Within this connection string, we reference the mongo container by its given name, mongo.
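The database and API launch sequence might look like this sketch. The API image name and the environment-variable name are hypothetical stand-ins; only the container names (mongo, api), the ports, and the fact that the connection string references the mongo container by name come from the lecture:

```shell
# Pull the official MongoDB image from Docker Hub
docker pull mongo

# Start MongoDB on the demo network, exposed on the default port 27017
docker run -d --name mongo --network cloud-native-demo -p 27017:27017 mongo

# Start the API on port 8080; the image name and env-var name below
# are assumptions standing in for whatever the course assets actually use
docker run -d --name api --network cloud-native-demo -p 8080:8080 \
  -e MONGO_CONN_STR='mongodb://mongo:27017' \
  example/cloud-native-api:latest
```

Because both containers sit on the same user-defined network, the hostname mongo in the connection string resolves automatically to the database container's IP address.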
Great, the API is now up and running. Finally, let's launch the frontend container with the name frontend and expose it on port 80, like so. Okay, that looks great. All three microservice containers are now up and running. This can be confirmed by rerunning the docker ps command. Here we can see that the frontend, API, and mongo containers are all running. Let's clear the terminal again and take a closer look at the logs associated with each of the containers; this is useful for troubleshooting purposes, if required. Starting with the frontend, you can see that nothing has been logged yet. Next, let's examine the API logs. Here we can see that the API service has reported successfully connecting to the backend mongo container, as per the connection string previously highlighted.
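A sketch of the frontend launch and the log checks; the frontend image name is a hypothetical stand-in:

```shell
# Start the frontend on port 80; the image name is an assumption
docker run -d --name frontend --network cloud-native-demo -p 80:80 \
  example/cloud-native-frontend:latest

# Confirm all three containers are up
docker ps

# Review each container's logs for troubleshooting
docker logs frontend
docker logs api
docker logs mongo
```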
Next, let's examine the mongo logs. Here we can see a lot of activity. Taking a closer look at some of the logging, we can see some interesting details. We can see that the database has been configured to listen for traffic originating from anywhere, as per the currently highlighted message, and that it is waiting for connections on port 27017. Further down, we can see that a connection has been accepted from the IP address 172.23.0.3. Now, I suspect that this is the API container. Let's try and confirm this.
I'll clear the terminal and then execute the following Docker command to display the IP address currently assigned to the api container. And indeed, the IP address matches. So this confirms that the API container has successfully connected to the MongoDB container. Since the API is now wired up to the database, let's try firing some curl requests at it.
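The IP lookup described here can be done with docker inspect and a Go-template filter; since the api container sits on a single network, this prints one address:

```shell
# Print the IP address assigned to the api container on its network(s)
docker inspect api \
  --format '{{range .NetworkSettings.Networks}}{{.IPAddress}}{{end}}'
```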
For starters, I'll try out the health-check endpoint, like so. Here we can see that it has responded successfully with an okay message, implying it is healthy. Let's now expand this test by hitting the languages API endpoint and formatting the response with jq. Here we can see that we got a response back, but that it is empty. Now, this is expected, since we have yet to create and populate the MongoDB database. Let's do that now.
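The two curl checks might look like the sketch below; the exact endpoint paths (/ok and /languages) are assumptions inferred from the narration, not confirmed by the transcript:

```shell
# Hit the health-check endpoint; the /ok path is an assumption
curl -s http://localhost:8080/ok

# Query the languages endpoint and pretty-print the JSON response with jq
curl -s http://localhost:8080/languages | jq .
```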
Jeremy is the DevOps Content Lead at Cloud Academy where he specializes in developing technical training documentation for DevOps.
He has a strong background in software engineering, and has been coding with various languages, frameworks, and systems for the past 20+ years. In recent times, Jeremy has been focused on DevOps, Cloud, Security, and Machine Learning.
Jeremy holds professional certifications for both the AWS and GCP cloud platforms.