February 27th, 2018
Serverless CI/CD Tutorial, Part 1: Build
By Stephanie Lingwood

Unit testing Serverless apps with CodeBuild and CodePipeline

This blog kicks off a three-part tutorial on setting up CI/CD pipelines for apps built with the Serverless framework. In this installment, we’ll set up a CI build that uses CodeBuild and CodePipeline to lint code and run unit tests. Next, we’ll cover integration testing using Jest, and create a build that deploys a test environment, tests against it, and tears it down. Finally, we’ll extend our pipeline to deploy to both staging and production environments. All of our code is available in our repo.

Where We Begin

Before we get started, let’s talk about the prior knowledge that this series of tutorials assumes. If you need to go brush up on some topics first, that’s fine! We’ll still be here when you get back.

We start with a todo-list CRUD app, written in Node.js, that has the following in place:

  • GET and POST handlers that respond to API Gateway events
  • A DynamoDB database for persistence
  • Linting using ESLint
  • Unit tests for each handler (we’ve written these using Jest)

All of these elements are configured using the Serverless framework and a serverless.yml.
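To make that concrete, here’s a stripped-down sketch of what a serverless.yml for an app like this might look like. The service name, handler paths, and table name are placeholders for illustration; your own configuration will differ:

service: serverless-ci-cd

provider:
  name: aws
  runtime: nodejs6.10

functions:
  getTodos:
    handler: handlers/get.handler      # placeholder handler path
    events:
      - http:
          path: todos
          method: get
  createTodo:
    handler: handlers/create.handler   # placeholder handler path
    events:
      - http:
          path: todos
          method: post

resources:
  Resources:
    TodosTable:
      Type: AWS::DynamoDB::Table
      Properties:
        TableName: todos               # placeholder table name
        AttributeDefinitions:
          - AttributeName: id
            AttributeType: S
        KeySchema:
          - AttributeName: id
            KeyType: HASH
        ProvisionedThroughput:
          ReadCapacityUnits: 1
          WriteCapacityUnits: 1

Each http event becomes an API Gateway endpoint backed by a Lambda function, and the resources block provisions the DynamoDB table.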

About CI

Now that we have our app, it’s time to start building our CI/CD pipeline. In this post we’re going to focus on the “CI” part of “CI/CD.” CI stands for “Continuous Integration,” which means that we want to automatically build and test our code every time a commit is pushed.

To do this, we need some sort of build runner—a service that can pull our code (often into a Docker container), install any dependencies, and then execute the lint and test commands against this code. When this “build” is done, our build environment (the container) is disposed of, and a fresh container is used for any subsequent build.

There are lots of CI services out there; Jenkins, TravisCI, and GitLab CI might be familiar names to you. We’ll be using AWS’s CodeBuild, coupled with CodePipeline, because its tight integration with other AWS services makes integration testing and deployment easier. In this blog, we will:

  • Set up CodeBuild, complete with its own S3 bucket
  • Add our CodeBuild project to CodePipeline
  • Create a buildspec.yml file for our CodeBuild project

Let’s get started!

Setting Up CodeBuild

One of the AWS services that CodeBuild integrates with is S3. Before we get started with CodeBuild, we’ll need to create an S3 bucket to store our build artifacts. Build artifacts are files produced by our builds that we want to save for future use. While we won’t need our artifacts until a later blog post, we have to set up our bucket now (the AWS console currently doesn’t support adding artifacts to an already-existing bucket).
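As an aside, if you’d rather define this bucket as code, it’s a one-resource CloudFormation template. Here’s a minimal sketch, assuming the bucket name we’re about to choose in the console:

Resources:
  ArtifactBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: serverless-ci-cd-test-artifacts   # assumed name; bucket names must be globally unique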

Log in to the AWS console and select “S3” from the Services dropdown. Click Create bucket and enter a name for your bucket. Here, I’m using “serverless-ci-cd-test-artifacts.” The name of my CodeBuild project will be “serverless-ci-cd-test” (more on that later), so I’ve appended “-artifacts” to my build name to make my bucket easier to find. Choose a region and click Next.


The defaults in the Set properties and Set permissions screens are fine, so keep clicking Next. Once you’re at the Review panel, double-check your name and region, and click Create bucket. It should now show in your list of buckets:


With that done, it’s on to CodeBuild. Choose CodeBuild from the Services menu, then click Create Project. First, let’s configure the project name and repo:


Let’s talk for a minute about CodeBuild project naming conventions. In the long run, it’s likely that you will have multiple projects for multiple repos in your account. A simple, yet useful, naming convention is to use the repo name followed by what the project does. In this example, our repo name is “serverless-ci-cd.” Since this is a project that will test our code, we’ll name it “serverless-ci-cd-test.”

Also, notice that I haven’t checked the Webhook check box. If this CodeBuild project were going to be a standalone project and not part of a pipeline, we would select that box. Each build would then be triggered by webhooks sent by GitHub whenever someone pushes to the repo or opens a PR. Since we’re going to add this project to CodePipeline, we don’t need to check this box; our pipeline will trigger builds for us instead.

Next up, we need to configure our build environment. Like many CI services, CodeBuild uses Docker containers. Docker containers are perfect for running builds, since they provide a consistent environment that’s easily provisioned and disposed of. Here, we’re selecting a Docker image for our project. You can either provide your own image or use one provided by CodeBuild. We’re using the CodeBuild Ubuntu image that has Node 7.0.0 installed, since it’s closest to the Node version used by our Serverless app (once you understand this build process, we recommend creating your own image that replicates the AWS Lambda Node.js environment and using that instead).
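For reference, that environment choice maps to something like the following in CloudFormation terms. The image identifier below is an assumption based on the console label, so double-check CodeBuild’s list of managed images for the exact name:

Environment:
  Type: LINUX_CONTAINER
  ComputeType: BUILD_GENERAL1_SMALL    # the smallest size is plenty for linting and unit tests
  Image: aws/codebuild/nodejs:7.0.0    # assumed identifier for the managed Ubuntu image with Node 7.0.0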



Note the other bit of setup here: choosing the buildspec.yml as the build specification. Builds need to know what commands to execute. Once we’re done setting up our project in CodeBuild and CodePipeline, we’ll create a buildspec.yml file in our repo that contains the build commands.

Now, it’s time to tell our CodeBuild project to save artifacts in that S3 bucket we created.

And now for the last three settings: Cache, Service Role, and VPC. We don’t need to do anything with Cache and VPC for this project, but we will need an IAM service role; when a build runs, it will assume this role. This will give it permissions to do things like write logs to CloudWatch and write artifacts to S3. Checking Create a service role creates this role for us.
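To give you a sense of what that generated role contains, here’s a rough sketch of the kind of policy statements it needs (abbreviated, with the bucket ARN assuming the artifact bucket we created earlier):

Version: "2012-10-17"
Statement:
  - Effect: Allow
    Action:
      - logs:CreateLogGroup
      - logs:CreateLogStream
      - logs:PutLogEvents
    Resource: "*"
  - Effect: Allow
    Action:
      - s3:GetObject
      - s3:PutObject
    Resource: arn:aws:s3:::serverless-ci-cd-test-artifacts/*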

We can skip the advanced settings and click Continue. Review all the settings on the next screen, then click Save. Your new project should now appear in your list of CodeBuild projects.

Great! We now have a CodeBuild project, as well as an S3 bucket for the artifacts we’ll create in the next tutorial. We now need to create a pipeline in CodePipeline and add our CodeBuild project to the pipeline.

Setting Up CodePipeline

It’s worth taking a break here to explain the difference between pipelines and builds. Builds are a lot like functions: they take inputs (a git repo, a group of files you’ve downloaded from S3, etc.), do something with those inputs in a given environment (download dependencies, run tests, create packages, deploy), and optionally produce outputs (artifacts). Pipelines are chains of actions that you tell to run in a certain order. Builds are just one kind of action; your pipeline can also watch for changes to your repo, have approval gates, invoke lambda functions, etc. Those actions are grouped into “stages,” such as “Source” (connect to a source code repo) and “Build” (run a CodeBuild project). CodePipeline requires that pipelines have at least two stages. Pipelines let you watch the entire flow of your code from commit to production deployment.
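If it helps to see that structure laid out, here’s a rough CloudFormation-style sketch of the two-stage pipeline we’re about to build through the console. The names and GitHub owner are placeholders, and a real GitHub source action would also need an OAuthToken:

Stages:
  - Name: Source
    Actions:
      - Name: Source
        ActionTypeId:
          Category: Source
          Owner: ThirdParty
          Provider: GitHub
          Version: "1"
        Configuration:
          Owner: your-github-user      # placeholder
          Repo: serverless-ci-cd
          Branch: master
        OutputArtifacts:
          - Name: SourceOutput
  - Name: Build
    Actions:
      - Name: Build
        ActionTypeId:
          Category: Build
          Owner: AWS
          Provider: CodeBuild
          Version: "1"
        Configuration:
          ProjectName: serverless-ci-cd-test
        InputArtifacts:
          - Name: SourceOutput

Each stage runs its actions in order, and a stage has to succeed before the next one starts.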

Let’s go create the pipeline for our app. Select CodePipeline from the services dropdown, then click Create Pipeline. Give your pipeline a name; I usually use my project name (or repo name) to be consistent.

Next, select your source provider. Again, my repo is on GitHub, so I select that and then click Connect to GitHub. You’ll be prompted to authenticate with your source provider, and then choose a repo and a branch to track. CodePipeline will watch that repo/branch and kick off the chain of actions when it changes.

After we click Next step, we’re brought to a screen to add our CodeBuild project to this pipeline:

Once we enter the build information, we select No Deployment as our deployment provider and click Next step (we’ll tackle deployment in a later tutorial).


Now we need to select a service role. Our pipeline needs a role to assume that gives it permission to do things—like access CodeBuild projects, CloudWatch events, etc. For now, we’re selecting the ready-made AWS-CodePipeline-Service role. Once you get your pipeline up and running, we recommend creating your own role with narrower permissions.

After clicking Next step, you’re brought to a screen to review your pipeline. Make sure you have all the names and settings correct and click Create pipeline.

Congrats! You’ll see your completed pipeline, with two stages: the Source stage (which tracks changes in your repo) and the Build stage (which will run your CodeBuild project with the latest code from your repo).

Add the buildspec.yml

Finally, it’s time to add the build configuration file, or buildspec.yml file, to our repo. That file will contain the commands for our CodeBuild project, like unit test and linting commands.

Create a buildspec.yml file in the root of your project’s repo that looks like this. Be sure to replace my lint and unit test commands with your own:

version: 0.2

phases:
  build:
    commands:
      - npm install
      - npm run-script lint
      - npm test

This will install dependencies (our npm modules) and then lint and unit test the code (if you want more information about buildspec.yml structure, check out the Build Specification Reference). Push these changes to your branch, then watch your pipeline. After a couple minutes, you’ll see your Source stage change to In Progress as it detects the change to your repo, then Success. That will trigger your CodeBuild project. To watch the progress, you can click on the AWS CodeBuild link to navigate to the currently running build; there, logs are output in near-real-time as your tests are run.

When the build completes, head back to your pipeline:

Success! You’ve now got a working pipeline that detects changes to your code, launches a build, and runs your unit tests. In the next tutorial, we’ll deploy our app to a test environment and run integration tests against it. If you have questions and need additional guidance, feel free to email us at info@1Strategy.com.