Test-Driven Development (TDD) is a discipline that helps you create high-quality, error-free code and drives productivity by reducing defects, among many other benefits.
This entry-level training course is designed to bring you quickly up to speed with the basic features and processes involved in performing Java-based test-driven development using JUnit5. It covers the key principles and practices of test-driven development, which you can then apply within your own software engineering projects.
In this course, you'll be guided through an end-to-end sample project in which we use test-driven development to build, test, and package a Java 11-based library that provides an API to convert Bitcoins into foreign currency.
If you have any feedback relating to this course, please contact us at email@example.com.
- Understand the fundamentals of TDD and JUnit5
- Use TDD to build, test, and package a Java 11-based library
- Perform TDD by iterating through Red-Green-Refactor cycles
- Create unit tests using the JUnit5 testing framework
- Software developers interested in using TDD to build JDK 11-based Java applications
To get the most out of this course, you should have a basic understanding of JDK 11 and the Java programming language, as well as software development and the software development life cycle (SDLC).
Welcome back. In the previous lesson, I showed you how to use a GitHub Action to do automated CI/CD builds of our Java project. In this lesson, I'm going to refactor our single build.yml file, which is our GitHub Actions workflow, into a dev-specific workflow and a prod-specific workflow.
In the dev workflow, I'm going to introduce an extra build step in which I generate code coverage data and then upload that data into Coveralls.io, which is an excellent free online service for visualizing code coverage reports. This will help us highlight the areas of our code that are being exercised and, more importantly, identify areas within our implementation that our current unit tests are not exercising.
Let's begin. I'll start by duplicating the build.yml file. I'll rename the first one to be dev.build.yml, and the second one will be prod.build.yml. Within the dev one, under the Maven build step, I'm going to add in the following. This is our code coverage build step. What this does is generate a code coverage report when our unit tests are run, and then upload it into Coveralls.io.
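The exact YAML isn't quoted in the transcript, but a coverage step along these lines could be added to dev.build.yml, assuming the JaCoCo and Coveralls Maven plugins and a repository secret named COVERALLS (the step name, goals, and secret name here are illustrative assumptions):

```yaml
# Sketch only: assumes the JaCoCo and Coveralls Maven plugins are configured in pom.xml
# and that the Coveralls repo token is stored as a repository secret named COVERALLS
- name: Code coverage
  run: mvn jacoco:report coveralls:report -DrepoToken=${{ secrets.COVERALLS }}
```

Because the token is read from the secrets context at runtime, it never appears in the committed workflow file.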
So I'll jump over into my browser and go to Coveralls. Here you can see my Coveralls.io account. It's currently empty, but once we've set up our new GitHub Actions build for our dev environment, you'll find that the code coverage report data will be automatically pushed up into Coveralls. For this to happen, we also need to configure a secret token. The way I've set this up is to leverage GitHub itself and store a secret named COVERALLS in here with the token. So let's set that up now.
Back within GitHub, under the settings for the current repo, I'll go to Secrets and create a new repository secret. I'll enter COVERALLS as the name, and then we need to put the token in here. So, back in the Coveralls account, I'm going to add a repo. I'll click on the sync repos button here to make sure I've got all of the current repos in my GitHub account. I'm then going to search for the new repo that I've just created, demo-java, and turn it on.
Okay, back on our home page, I'll click on this repo, then click on Settings. In here we've got a repo token, so I'll copy it. Back in my GitHub secrets, I'll paste the value here and click Add secret, and our secret is now installed. I'll go back to the source code. When our GitHub Action runs and this job step attempts to push the code coverage data up into Coveralls, it will use that particular secret. The good thing about this is that we don't need to store the secret literally in the workflow itself, because remember, the workflow will be pushed up into the GitHub repository.
Okay, I'll save that, and that completes the setup for the dev workflow. I'll now work on the prod one. The only difference here is that I'm going to add a build step at the end that's going to automatically make a release for us. It will do so by using the jar file that is generated when our Maven build runs and gets stored in the target directory.
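The release step itself isn't shown in the transcript, but as one possible sketch, a community release action could publish the jar from the target directory; the specific action, step name, and file glob below are assumptions for illustration:

```yaml
# Sketch only: softprops/action-gh-release is one commonly used community release
# action; the step name and file glob are assumptions
- name: Make release
  uses: softprops/action-gh-release@v1
  with:
    files: target/*.jar
  env:
    GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
```

The GITHUB_TOKEN secret is provided automatically by GitHub Actions, so no extra setup is needed for it.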
The prod workflow won't do the code coverage upload into Coveralls; only the dev workflow will. The other thing I'll do is update the trigger. In this case, I'm going to change it so that this workflow only triggers when a tag is pushed up into the repo. I'll save that, and then we're almost good to go. However, we still need to update our Maven pom file, because we need to give it the capability of generating the code coverage data points that get pushed up into Coveralls during the dev workflow.
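A tag-only trigger in prod.build.yml could look like the following; the glob pattern is an assumption, since the exact filter isn't shown:

```yaml
# Sketch only: run the prod workflow on pushed tags rather than branch pushes;
# the '*' pattern (match any tag) is an assumption
on:
  push:
    tags:
      - '*'
```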
To do that, I'll scroll down into the plugin area, jump back into my browser, go back to the root of our project, and navigate into step six. These are the two plugins that we need to generate our code coverage. Okay, back within our project, in our pom.xml file, I've pasted in those two plugins.
As for what each plugin does: the first plugin is used to actually generate the unit testing code coverage reports, and the second one is used to upload the code coverage report data into Coveralls for visualization. Okay, I'll save that. The next thing I want to do is quickly make an update to our directory structure.
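The plugin XML isn't quoted in the transcript, but a typical pairing for this kind of setup is JaCoCo for generating the coverage reports and the Coveralls Maven plugin for the upload; the versions below are illustrative assumptions:

```xml
<!-- Sketch only: plugin versions are illustrative assumptions -->
<plugin>
  <groupId>org.jacoco</groupId>
  <artifactId>jacoco-maven-plugin</artifactId>
  <version>0.8.7</version>
  <executions>
    <!-- Attach the JaCoCo agent so coverage is recorded while the unit tests run -->
    <execution>
      <goals>
        <goal>prepare-agent</goal>
      </goals>
    </execution>
    <!-- Generate the coverage report after the test phase -->
    <execution>
      <id>report</id>
      <phase>test</phase>
      <goals>
        <goal>report</goal>
      </goals>
    </execution>
  </executions>
</plugin>
<plugin>
  <groupId>org.eluder.coveralls</groupId>
  <artifactId>coveralls-maven-plugin</artifactId>
  <version>4.3.0</version>
</plugin>
```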
Now, the reason I want to do this before we run the code coverage reports is that Coveralls expects the package structure to be reflected on the file system, as this is the convention Coveralls uses to map the right code coverage reports back to the right classes that were tested. Okay, that looks good. I'll then do a git status, followed by a git add to add back in the untracked files, the .github and source directories. I'll do git status again to make sure I've got everything. That looks good. I'll now commit and push.
Okay, so all of our updates have now been pushed back up into our GitHub repo. Let's jump over into it. I'll click on Actions, and we can see that an update has been queued for the dev workflow. The prod one hasn't triggered, and that's expected, because we configured it to only trigger when we push a tag up into the repo.
Let's drill into the dev workflow. In particular, we're interested in how the code coverage job step goes. Okay, the code coverage job step is now running, and so far so good. This is generating the code coverage reports, and it will also push the reports up into Coveralls. Brilliant. Our latest version of the dev workflow has completed, and it's been successful. If we now jump over into Coveralls, absolutely brilliant.
We can see that our unit testing code coverage report has been pushed up into Coveralls, and it's been determined that we've got 91% code coverage, which is not bad at all. If we drill into it, we can actually drill down into the source. Eventually we'll get to the file, and if we click on it, we can see the source code that makes up this file. Anywhere there is a green highlighted section tells us that our unit tests have exercised this part of the code base, and anywhere it's red tells us where our current unit tests could be improved to test areas of code that have been missed. So all in all, this is a great result.
Let's jump back into our editor, and this time let's trigger a production workflow build. To do so, we need to add a tag, but before I do, I'm going to jump into the pom.xml file and scroll to the top. Currently we're building a snapshot version, and our production workflow is really about doing a proper release, so we'll remove the snapshot part of the version and save it. We'll then run git status, add the pom file in, and do a git commit with the message "Release version 1.0.0". I'll then set up the tag, and this time I'll do a git push and I'll push tags.
Okay, I can see our new tag, so I'll jump over into our repo and click on Actions. Excellent, we can see here that we've triggered on our latest tag, version 1.0.0, and that it's triggered the production workflow. We'll drill into it. Again, we set up our production workflow to be the same as the dev one, except we're not going to do the code coverage aspect; that's just for the dev workflow. Instead, we're going to run the make release job step, which will produce an official release for us.
Okay, it looks like it's wrapping up, and indeed it has. Again, the build job for production has completed successfully, so this is a great result. If we go to the project root and now look at Releases, we can see we've got our first official release. We'll click on it, and here we can see the assets that make up that official release.
As expected, we've got our Bitcoin converter service library, version 1.0.0, which we can download if we want. Okay, that completes this demonstration, in which I showed you how to refactor the single workflow into a dev-specific workflow and a prod-specific workflow. The dev workflow introduced code coverage reports and uploaded the resulting data into Coveralls for visualization and reporting, and the production workflow introduced a release.
Okay, go ahead and close this lesson, and I'll see you shortly in the next one, where we build a Java console app that imports and uses this release of our Bitcoin converter library.
Jeremy is a Content Lead Architect and DevOps SME here at Cloud Academy where he specializes in developing DevOps technical training documentation.
He has a strong background in software engineering, and has been coding with various languages, frameworks, and systems for the past 25+ years. In recent times, Jeremy has been focused on DevOps, Cloud (AWS, GCP, Azure), Security, Kubernetes, and Machine Learning.
Jeremy holds professional certifications for AWS, GCP, and Kubernetes.