.Net Microservices - Build Deployment and Hosting - Course Two

Automated Container Builds

Overview
Difficulty: Advanced
Duration: 2h 11m
Students: 81

Description

Introduction
In this advanced course, we take a legacy monolithic .NET application and re-architect it to use a combination of cloud services to increase scalability, performance, and manageability.
 
Learning Objectives 
This course will enable you to:
  • Understand the principles and patterns associated with microservices
  • Understand the principles and patterns associated with RESTful APIs
  • Understand important requirements to consider when migrating a monolithic application into a microservices architecture
  • Understand the benefits of using microservices and associated software patterns and tools to build microservice-based applications at speed and scale
  • Understand tradeoffs between different architectural approaches
  • Become familiar and comfortable with modern open-source technologies such as .NET Core, Docker, Docker Compose, Linux, Terraform, Swagger, and React
  • Become familiar with Docker and container orchestration runtimes used to host and run containers, such as Docker Compose, Amazon ECS using Fargate, and Amazon EKS

Prerequisites

  • A basic understanding of software development
  • A basic understanding of the software development life cycle
  • A basic understanding of DevOps and CI/CD practices
  • Familiarity with .NET and C#
  • Familiarity with AWS

Intended Audience
  • Software Developers and Architects
  • DevOps practitioners interested in CI/CD implementation
  • Anyone interested in understanding and adopting microservices and RESTful APIs within their own organisation
  • Anyone interested in modernising an existing application
  • Anyone interested in Docker and containers in general
  • Anyone interested in container orchestration runtimes such as Kubernetes

Transcript

- [Instructor] Welcome back. In this lecture, we're going to complete the setup of our Bitbucket pipeline. So the first thing we'll do is jump into Visual Studio. You'll recall that once our Docker images have been built successfully, we push them up to our Docker Hub repository as per the docker push command here. Now, for docker push to work, it needs to authenticate against our Docker Hub registry, and we do that authentication as part of this line here. The authentication requires credentials, a username and a password, and we'll pass those in as environment variables, so we need to set these up in Bitbucket Pipelines. Jumping back into our browser and into Bitbucket, under Settings we'll scroll down and, under Pipelines, configure the settings. We need to enable Pipelines, so we'll do that here. And then finally, within the Environment variables section, we need to specify our Docker Hub credentials. So we'll add the first one, and then we need to set up the password. Back within Visual Studio, we'll make a quick update to our build script file. We'll just add a note, and we'll save it.
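The structure described here can be sketched as a minimal bitbucket-pipelines.yml. This is an illustrative outline only; the variable names (DOCKER_HUB_USERNAME, DOCKER_HUB_PASSWORD) and image name are hypothetical, not necessarily those used in the course:

```yaml
# Hypothetical bitbucket-pipelines.yml sketch; variable and image names are illustrative.
image: atlassian/default-image:latest

pipelines:
  default:
    - step:
        name: Build and push Docker images
        services:
          - docker        # enables the Docker daemon inside the build container
        script:
          # Credentials come from the repository's environment variables,
          # configured under Settings > Pipelines > Environment variables.
          - echo "$DOCKER_HUB_PASSWORD" | docker login --username "$DOCKER_HUB_USERNAME" --password-stdin
          - docker build -t myorg/inventoryservice:latest ./InventoryService
          - docker push myorg/inventoryservice:latest
```

Passing the password via --password-stdin keeps it out of the build log and the shell history, which is why credentials are injected as environment variables rather than written into the pipeline file.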

We'll commit this back into our Bitbucket repository to trigger a pipeline build. So: git status, git add, git commit, give it a message. And finally, we push it. Okay, we now jump back into the Bitbucket Pipelines view. And if all has gone well, this should trigger a build, and it has, so that's a great result. So we can navigate into it and watch the pipeline build as it happens. The first time it runs, it will have to download the Docker images and cache them locally, so this part will take a few minutes. But this is a one-off; once that's cached, subsequent builds will happen a lot quicker.

So we'll speed this up and see what we get at the other end. It looks like our Docker images have all been built successfully. However, I notice we've got a problem pushing our Docker images into the Docker Hub registry, likely because we've entered the wrong credentials, so we'll need to fix this. Now, the last part of this build is taking a while to complete, and you'll notice here that we're actually uploading the Docker images into our Docker cache. This is a technique that allows any later builds to run a lot quicker. This one-off process of uploading the Docker images into the Docker cache within the pipeline happens the first time we do a build and the cache is empty. The Docker cache within Bitbucket Pipelines lasts for 24 hours before it is refreshed.
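Enabling this cache is a small addition to the step definition. A hedged sketch, with the surrounding keys illustrative:

```yaml
pipelines:
  default:
    - step:
        name: Build and push Docker images
        services:
          - docker
        caches:
          - docker      # Bitbucket's predefined Docker cache: reuses image layers across builds
        script:
          - docker build -t myorg/inventoryservice:latest .
```

With the cache populated, subsequent builds pull base layers from the cache instead of downloading them from Docker Hub, which is the speed-up observed in the next build.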

Awesome, so you can see that our pipeline has completed; it took nine minutes and 51 seconds, with a successful outcome. So in the background, I've gone ahead and updated our Docker Hub credentials, so these are now using the right ones. Then, back in Pipelines, I manually triggered a new build. The outcome of this was execution number three, and again it was successful. Now, during the build setup, one thing I want to point out is that you can see we're now actually downloading from the cache rather than getting the base images from Docker Hub.

Therefore, the overall build process happens a lot quicker. Next, within our build script file, where the core build logic actually runs, if we navigate down to where the push happens, we can now see that we're successfully authenticating to the Docker Hub registry and that we've successfully pushed up our just-built Docker image. So if we now jump into Docker Hub, we can take a look at each of the Docker images that have just been built and pushed into our registry. We have one for the inventory service, one for store 2018, the presentation layer, one for our account service, and one for our shopping service. So in review, we've completed our automated build using Bitbucket Pipelines. That completes this lecture; go ahead and close it, and we'll see you shortly in the next one.

About the Author

Students: 7,746
Labs: 21
Courses: 52
Learning paths: 11

Jeremy is the DevOps Content Lead at Cloud Academy where he specializes in developing technical training documentation for DevOps.

He has a strong background in software engineering, and has been coding with various languages, frameworks, and systems for the past 20+ years. In recent times, Jeremy has been focused on DevOps, Cloud, Security, and Machine Learning.

Jeremy holds professional certifications for both the AWS and GCP cloud platforms.