
Building a CI/CD Pipeline for Serverless Applications on AWS with AWS CDK

In 2019, one of the biggest challenges tech companies face is capacity: businesses are forced to deliver working software while the backlog is full and engineering capacity is limited. Automation and tooling aren't new in this industry; if anything, making an informed decision about which tools best fit your problem domain becomes more difficult as the variety of suitable tools grows.

We’re going to take a look at how AWS tools—AWS CodePipeline and AWS CodeBuild—can support the software lifecycle by automating a series of delivery steps.

If you’d rather get right to the source code for this post, click CDK Serverless Pipeline Starter Template.

Prerequisites & Notes

You will need the AWS CLI to follow along step by step in this post. If you prefer not to use the CLI, you can also accomplish all of these steps via the AWS Console, though your experience may vary.

AWS CDK will need to be installed via NPM, and it's important to note that we'll be building our pipeline using Python 3.7. There's a distinct difference between the CDK installation, which provides CLI access for running CDK commands, and the Python-based CDK source modules, which allow us to write and execute the code for the pipeline. Once the NPM installation is complete, the Python modules can be installed via Pip using the requirements.txt file located in the root of the project repository.

It’s also expected that your application code is generated using the AWS SAM CLI. If you’re using another framework or rolling your own, you’ll need to modify the build stage of the pipeline to fit that differing workflow. Your application code should be located in a GitHub repository. Configuring a different source is not a major revision. You can find more information in the AWS docs about configuring different sources for your application code.

In short, this pipeline configuration assumes that you're familiar with and have already set up the AWS CLI and AWS CDK, and that you have an AWS SAM application located in a GitHub repository.


We are going to take advantage of AWS Parameter Store and AWS Secrets Manager to house the runtime parameters for our pipeline. These parameters include:

  1. A primary email address to subscribe to the pipeline notifications; you’ll be notified of Action state changes, such as build start, success, and failure
  2. The name of the repository on GitHub where your application source code is located
  3. The GitHub username or organization name that owns the repository
  4. The GitHub OAuth Token with full permissions for CodePipeline to read and write to your repository

Setting these parameter values into AWS is done either via the console or through the CLI. We’re going to take advantage of the CLI for these API calls:

aws ssm put-parameter \
    --name /serverless-pipeline/sns/notifications/primary-email \
    --description "Email address for primary recipient of Pipeline notifications" \
    --type String \
    --value YOUR_EMAIL_ADDRESS
aws ssm put-parameter \
    --name /serverless-pipeline/codepipeline/github/repo \
    --description "Github Repository name for CloudFormation Stack serverless-pipeline" \
    --type String \
    --value GITHUB_REPO_NAME
aws ssm put-parameter \
    --name /serverless-pipeline/codepipeline/github/user \
    --description "Github Username for CloudFormation Stack serverless-pipeline" \
    --type String \
    --value GITHUB_USER
aws secretsmanager create-secret \
    --name /serverless-pipeline/secrets/github/token \
    --secret-string '{"github-token":"YOUR_TOKEN"}'

Note: These value stores are region-specific, so be sure you deploy these parameters into the same region that you plan to deploy your serverless application.
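The parameter names above all share a common prefix. As an illustration, a small helper like this (hypothetical, not part of the starter template) keeps the paths consistent between the CLI calls and the code that later reads them:

```python
# Hypothetical helper: builds the SSM parameter paths used above so the
# CLI calls and the CDK stack reference identical names.
PREFIX = '/serverless-pipeline'

def param_path(*parts):
    """Join path segments under the shared pipeline prefix."""
    return '/'.join((PREFIX,) + parts)

EMAIL_PARAM = param_path('sns', 'notifications', 'primary-email')
REPO_PARAM = param_path('codepipeline', 'github', 'repo')
USER_PARAM = param_path('codepipeline', 'github', 'user')
```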

Pipeline Details


CodeBuild provides a convenient service for configuring your build process, much like TravisCI or Jenkins, though it is somewhat light on features compared to more widely used automation tools. As of this writing, CodeBuild recently released support for the standard Amazon Linux 2 base image, which gives development teams the option to specify one or more programming language runtime versions in the buildspec. This means our Build stage is flexible enough to adapt easily to many different serverless applications or scripting requirements.

A buildspec is a series of phases—such as install, pre_build, build, and post_build—which define scripted processes for building and packaging your source code for distribution to deployment stages. Below is an example from this project of how the CodeBuild project is configured in our CDK application; note that this Python code mirrors almost exactly what you would expect to see in a CloudFormation template for the same resource definition.

build_project = build.PipelineProject(
    self, 'BuildProject',
    description='Build project for the serverless-pipeline',
    environment=build.BuildEnvironment(
        build_image=build.LinuxBuildImage.STANDARD_2_0,
    ),
    environment_variables={
        'BUILD_ARTIFACT_BUCKET': build.BuildEnvironmentVariable(value=artifact_bucket.bucket_name),
    },
    cache=build.Cache.bucket(artifact_bucket, prefix='codebuild-cache'),
    build_spec=build.BuildSpec.from_object({
        'version': '0.2',
        'phases': {
            'install': {
                'runtime-versions': {
                    'nodejs': 10,
                },
                'commands': [
                    'echo "--------INSTALL PHASE--------"',
                    'pip3 install aws-sam-cli',
                ],
            },
            'pre_build': {
                'commands': [
                    'echo "--------PREBUILD PHASE--------"',
                    '# Example shows installation of NPM dependencies for shared deps (layers) in a SAM App',
                    '# cd functions/dependencies/shared_deps_one/nodejs',
                    '# npm install && cd',
                    '# cd functions/dependencies/shared_deps_two/nodejs',
                    '# npm install && cd',
                ],
            },
            'build': {
                'commands': [
                    'echo "--------BUILD PHASE--------"',
                    'echo "Starting SAM packaging `date` in `pwd`"',
                    'sam package --template-file template.yaml --s3-bucket $BUILD_ARTIFACT_BUCKET --output-template-file package.yaml',
                ],
            },
            'post_build': {
                'commands': [
                    'echo "--------POST-BUILD PHASE--------"',
                    'echo "SAM packaging completed on `date`"',
                ],
            },
        },
        'artifacts': {
            'files': ['package.yaml'],
            'discard-paths': 'yes',
        },
        'cache': {
            'paths': ['/root/.cache/pip'],
        },
    }),
)
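It's worth noting that BuildSpec.from_object accepts plain Python data, so the spec can be constructed and inspected outside CDK entirely. A minimal stdlib sketch (the keys below are taken from the spec above, condensed for brevity):

```python
import json

# Plain-data skeleton of the buildspec above; BuildSpec.from_object simply
# serializes a dict like this into the rendered buildspec document.
buildspec = {
    'version': '0.2',
    'phases': {
        'install': {'runtime-versions': {'nodejs': 10},
                    'commands': ['pip3 install aws-sam-cli']},
        'pre_build': {'commands': ['echo "--------PREBUILD PHASE--------"']},
        'build': {'commands': ['sam package --template-file template.yaml '
                               '--s3-bucket $BUILD_ARTIFACT_BUCKET '
                               '--output-template-file package.yaml']},
        'post_build': {'commands': ['echo "SAM packaging completed on `date`"']},
    },
    'artifacts': {'files': ['package.yaml'], 'discard-paths': 'yes'},
}

# Inspect the document exactly as CodeBuild would receive it
print(json.dumps(buildspec, indent=2))
```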


In answer to the DevOps community's cries for support, AWS developed its own pipeline management tool, aptly named CodePipeline. While the tool itself is fairly lightweight in comparison to Jenkins, it offers several key benefits, such as tight integration with commonly used AWS tools and services. Defining a pipeline is one area where CDK excels, requiring very few lines of actual code to provision the resource and its associated stages. Below you'll see the resource definition for just the pipeline.

serverless_pipeline = pipeline.Pipeline(
    self, 'ServerlessPipeline',
    pipeline_name='serverless-pipeline',
)

Creating stages for the pipeline can be done via methods on the pipeline construct, as seen below where we define a Source and Build stage respectively. The source code for this project also includes Staging and Production stages.

serverless_pipeline.add_stage(stage_name='Source', actions=[
    actions.GitHubSourceAction(
        action_name='SourceCodeRepo',
        owner=github_user,
        repo=github_repo,
        oauth_token=core.SecretValue.secrets_manager(
            '/serverless-pipeline/secrets/github/token', json_field='github-token'),
        output=source_output,  # pipeline.Artifact holding the checked-out source
    ),
])
serverless_pipeline.add_stage(stage_name='Build', actions=[
    actions.CodeBuildAction(
        action_name='CodeBuildProject',
        project=build_project,
        input=source_output,
        outputs=[build_output],  # pipeline.Artifact holding package.yaml
    ),
])

Note: The Source stage requires that a one-time OAuth handshake be completed manually in the browser before automated deployment can occur. Create a new pipeline in the console, manually authorize GitHub as a source, and then cancel the pipeline wizard to complete that handshake.

Releasing Code

Once your pipeline deployment with CDK is complete, standard Git/GitHub workflow pushes and pull-request merges on the master branch of your repository will trigger a new pipeline run that deploys your application. The pipeline is configurable. For example, if you prefer to trigger the pipeline from a different branch, you can quickly change the branch value in the Source stage of your pipeline to meet your needs (you would need to deploy your CDK app again).
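As a sketch of that branch change (assuming the Source stage uses the CDK GitHubSourceAction, as in the starter template), it comes down to a single keyword argument:

```python
# Hypothetical fragment: switching the pipeline trigger to a 'develop' branch.
# All other arguments stay as defined in the Source stage of the pipeline.
actions.GitHubSourceAction(
    action_name='SourceCodeRepo',
    owner=github_user,
    repo=github_repo,
    branch='develop',   # defaults to 'master'; change it, then redeploy the CDK app
    oauth_token=oauth_token,
    output=source_output,
)
```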

Having an automated CI/CD pipeline can have a significant impact on your operational efficiency, allowing your teams to build faster and deploy more frequently without relying on processes that are prone to human error. Don't forget to check out the source code for a deeper look at how to configure and deploy this starter template.

We’re here to help, too! If you are interested in learning about how 1Strategy can help you optimize security, manage your resources, or control costs on AWS, we’re just an email away at