Contents
Introduction to AWS CodeBuild
In this course, we explain the basics of AWS CodeBuild and how it can be used to compile, build, and test your source code.
Learning Objectives
By the end of this course, you will have a greater understanding of the AWS CodeBuild service, including:
- What the service is and the benefit it provides
- What a buildspec file is
- How to use AWS CodeBuild to create a deployable artifact
Intended Audience
- Those who are implementing and managing software builds on AWS
- Those who are looking to take a certification, such as the AWS Certified Developer - Associate certification
Prerequisites
- A fundamental understanding of AWS, including services such as Amazon S3, Amazon CloudWatch, and AWS IAM
- It would be helpful to understand some basic principles of code, to be familiar with YAML files, and to have some knowledge of Docker containers
AWS CodeBuild is a fully managed build service. It’s designed to compile your source code, run unit tests, and then create an artifact that can eventually be deployed. Since it’s fully managed, it does this all without you having to provision, scale, or maintain your own build servers.
AWS CodeBuild is commonly used in CI/CD pipelines alongside other AWS or third-party developer tools. For example, you can use CodePipeline to glue services like CodeCommit, CodeBuild, and CodeDeploy together. When these services are used together, a commit to a CodeCommit repository can trigger CodeBuild to kick off and create the build artifacts.
There are three core concepts of CodeBuild: a CodeBuild project, a build environment, and a buildspec file. In the service console, the first thing you’ll do is create a CodeBuild project. This project is where you specify all of the information needed to perform your build.
You’ll specify where to get your source code - whether it’s stored in Amazon S3 or in a repository hosted in services such as AWS CodeCommit, GitHub, GitHub Enterprise, or Bitbucket.
You’ll then specify how you want CodeBuild to build your environment. Build environments are Docker containers that include everything needed to manage and execute your build - this is where the true work of compiling your source code, running your unit tests, and creating a deployable package really happens. You can choose to use a managed Docker image or create a custom Docker image that’s preloaded and configured with whatever build and software tools you require.
The managed Docker images support Amazon Linux 2 and Ubuntu workloads. If you’re using the Ireland, Oregon, Northern Virginia, or Ohio Regions, you also have the option to use a Windows Server Core 2019-based image. Each of these images supports a variety of runtimes, such as Java, .NET Core, Ruby, Python, Go, Node.js, and Android.
If you’d like to use your own custom Docker image, CodeBuild supports images built for ARM, Linux, Linux GPU, and Windows Server 2019 environments. However, depending on the Region you choose, these options may differ.
To use your own custom image, you’ll need to upload this image to either Amazon Elastic Container Registry or an external registry, such as the Docker Hub Registry. Once you specify the repository, CodeBuild will retrieve the image from the registry and use it to launch your build environment.
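To make this more concrete, here is a minimal sketch of how a CodeBuild project could be declared in a CloudFormation template. The project name, role ARN, repository URL, bucket, and image tag below are placeholder assumptions rather than values from this course:

```yaml
Resources:
  SampleBuildProject:
    Type: AWS::CodeBuild::Project
    Properties:
      Name: sample-build-project                 # hypothetical project name
      ServiceRole: arn:aws:iam::111122223333:role/CodeBuildServiceRole  # placeholder role
      Source:
        Type: CODECOMMIT                         # could also be S3, GITHUB, GITHUB_ENTERPRISE, or BITBUCKET
        Location: https://git-codecommit.us-east-1.amazonaws.com/v1/repos/sample-repo
      Environment:
        Type: LINUX_CONTAINER
        ComputeType: BUILD_GENERAL1_SMALL
        # A managed image is shown here; for a custom image, point this at
        # your Amazon ECR repository or an external registry image instead.
        Image: aws/codebuild/amazonlinux2-x86_64-standard:5.0
      Artifacts:
        Type: S3
        Location: sample-artifact-bucket         # placeholder bucket for build output
        Packaging: ZIP
        Name: sample-build-output.zip
```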
Even though your build environment contains everything you need to execute a build, you still need to tell CodeBuild how to build your code. The way you do this is through build commands. There are two main ways you can specify build commands: by writing the build commands directly in CodeBuild or by using a buildspec file.
Inserting your commands directly is a quick and simple way to run your build, especially if you only have a few commands to run. However, if you have a more extensive build process requiring more complex commands to create your artifact, you’ll want to use a buildspec file instead.
Buildspec files are YAML-based files that enable you to specify which commands to run during the different phases of your build, such as the install, pre-build, build, and post-build phases.
For example, if you’d like to create a deployable Java artifact, you may want to run the command mvn install during the build phase to have Apache Maven compile, test, and package your code. Using these commands, CodeBuild will turn the input Java source into an output artifact file called messageUtil-1.0.jar, which it will then upload to an S3 bucket.
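A minimal buildspec sketch for a Maven build along those lines might look like this - the runtime version is an assumption, so match it to the image you choose:

```yaml
version: 0.2

phases:
  install:
    runtime-versions:
      java: corretto11          # assumed runtime; pick one your build image supports
  pre_build:
    commands:
      - echo Entering the pre_build phase...
  build:
    commands:
      # Runs the Maven build and produces target/messageUtil-1.0.jar
      - mvn install
  post_build:
    commands:
      - echo Build completed on `date`
artifacts:
  files:
    - target/messageUtil-1.0.jar
```

Note that the destination S3 bucket is configured on the CodeBuild project itself, not in the buildspec - the artifacts section only tells CodeBuild which files to package as the output.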
The benefit of using a buildspec is that you can manage this YAML file like you would any other piece of code, through source control - enabling you to better maintain and track changes to the buildspec over time.
Moving on to the next part of CodeBuild. At this point, the CodeBuild project knows where to get the code, how to create the build environment, and how to complete the build using the instructions in the buildspec - so the last big piece it needs is where to put the output artifact.
When a build completes, you have the option to save your build artifacts to an S3 bucket. By doing this, you can leverage native S3 controls to manage your builds, such as S3 bucket triggers. These triggers enable you to kick off other actions when your build output is uploaded. For example, once your output is placed in S3, you can invoke a Lambda function to perform some kind of action.
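As a sketch, that kind of S3-to-Lambda trigger could be wired up in CloudFormation roughly as follows. The bucket name and function ARN are placeholders, and you would also need an AWS::Lambda::Permission that allows S3 to invoke the function:

```yaml
Resources:
  ArtifactBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: sample-artifact-bucket          # placeholder bucket name
      NotificationConfiguration:
        LambdaConfigurations:
          # Invoke a (hypothetical) post-processing function whenever a new
          # object - such as a build artifact - lands in the bucket.
          - Event: s3:ObjectCreated:*
            Function: arn:aws:lambda:us-east-1:111122223333:function:post-build-processor  # placeholder ARN
```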
Currently, S3 is the only supported output repository. However, using the buildspec gives you a bit more flexibility in where you send your output artifacts. For example, one use case for CodeBuild is building Docker containers. In your buildspec, you can specify that the post-build phase should run the docker push command against an Amazon ECR repository, thus outputting the built image to a location separate from S3.
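A buildspec along those lines might look like the following sketch, where the account ID, Region, and repository name are placeholders:

```yaml
version: 0.2

phases:
  pre_build:
    commands:
      # Log in to the (placeholder) ECR registry before pushing.
      - aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin 111122223333.dkr.ecr.us-east-1.amazonaws.com
  build:
    commands:
      - docker build -t sample-app:latest .
      - docker tag sample-app:latest 111122223333.dkr.ecr.us-east-1.amazonaws.com/sample-app:latest
  post_build:
    commands:
      # Push the built image to ECR instead of (or in addition to) S3.
      - docker push 111122223333.dkr.ecr.us-east-1.amazonaws.com/sample-app:latest
```

Keep in mind that building Docker images inside CodeBuild requires privileged mode to be enabled on the build environment.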
Once you kick off your build in CodeBuild, you can monitor its status. CodeBuild also publishes build state change events to Amazon CloudWatch Events, so you can detect and react to changes in the state of your build.
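For example, here is a sketch of a CloudWatch Events rule, defined in CloudFormation, that matches builds ending in a FAILED state - the rule name is chosen for illustration, and you would attach your own targets, such as an SNS topic or Lambda function:

```yaml
Resources:
  FailedBuildRule:
    Type: AWS::Events::Rule
    Properties:
      Description: Match CodeBuild builds that end in a FAILED state
      EventPattern:
        source:
          - aws.codebuild
        detail-type:
          - CodeBuild Build State Change
        detail:
          build-status:
            - FAILED
      # Targets (an SNS topic, a Lambda function, and so on) would be added here.
```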
In summary, you create a CodeBuild project that specifies where to get your source code, the Docker image needed to create the build environment, the buildspec file that provides instructions for building your code, and optional storage for your output. That’s it for this one - I’ll see you next time.
Alana Layton is an experienced technical trainer, technical content developer, and cloud engineer living out of Seattle, Washington. Her career has included teaching about AWS all over the world, creating AWS content that is fun, and working in consulting. She currently holds six AWS certifications. Outside of Cloud Academy, you can find her testing her knowledge in bar trivia, reading, or training for a marathon.