The Unified Pipeline: CI/CD in the Cloud

For centuries, pipelines have been built to move gas, oil, and other important resources. The concept of the pipeline was built around moving things quickly, reliably, and consistently with little maintenance. In software, the same principles apply.

One of the keys to a successful development shop is how quickly and efficiently you can get software in front of customers. No matter how great a developer you are, your first iteration probably won’t be perfect and will likely need several more iterations to work out the kinks. How can we iterate quickly? I believe that with smaller services, isolated deployments, and an efficient pipeline, you can deliver products faster and more reliably. The Unified Pipeline is the bridge to get there.

What is The Unified Pipeline?

In the Cloud era, where we are no longer limited to deploying runtimes on dedicated host machines, our systems are composed of heterogeneous services: some in Java, some in Go, some in Python or other languages. A single deployment pipeline that lets software engineers continuously integrate and deploy any of these technologies lines up well with a Polyglot Cloud environment. In this post, we’ll discuss a method to make this a reality with Jenkins Pipeline.

Jenkinsfile

Jenkins Pipeline uses a Jenkinsfile for declarative pipeline configuration. There are many benefits to using a Jenkinsfile over conventional build configurations, and you can find plenty of articles on that topic. One major drawback of configuration as code is that the code can get large, which becomes a headache when it’s repeated across multiple projects. Once I had a really good pipeline configured, I needed most, if not all, of the same configuration for my next service. This became a problem as we copied big Jenkinsfiles from one microservice to the next. In some cases I used a different language, but the build was nearly the same every time. The steps were essentially Build, Test, Dockerize, Deploy, Integration Test. Although this is a simplified version of my end result, the stages were consistent for the majority of my microservices.
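
To make the repetition concrete, the stage sequence above can be sketched as a declarative Jenkinsfile. The stage names match the steps described; the shell commands and image name are placeholders, not the author’s actual build targets:

```groovy
pipeline {
    agent any

    stages {
        // Same five stages, copied into service after service.
        stage('Build')            { steps { sh 'make build' } }
        stage('Test')             { steps { sh 'make check' } }
        stage('Dockerize')        { steps { sh 'docker build -t my-service .' } }
        stage('Deploy')           { steps { sh 'make publish' } }
        stage('Integration Test') { steps { sh 'make integration-test' } }
    }
}
```

Only the commands inside the stages tend to vary between services, which is what makes the template approach below possible.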

Jenkins Library

I decided to use my pipeline as a template and build a Jenkins Library that could be leveraged in any project. Yes, I wanted a reusable pipeline that behaved like any other component in my infrastructure. The trick was making the entire pipeline configurable, so different runtimes could be supported using CI containers, and commands could be configured to run in each stage. Here’s your “Hello World” example of a typical pipeline.

pipeline {
    agent any

    stages {
        stage('Build') {
            steps {
                sh 'make'
            }
        }
        stage('Test') {
            steps {
                sh 'make check'
            }
        }
        stage('Deploy') {
            steps {
                sh 'make publish'
            }
        }
    }
}

As you can see, the above pipeline has a make dependency. What if we used Maven? The CI container would need to pull in Maven, and the appropriate Maven commands would run in the right stages. The same goes for Gradle or any other build tool. I decided to take my entire pipeline, harden it, and make it completely configurable. The trick was allowing users to declare their own CI container with whatever dependencies they needed. Now I could have common tooling with a declarative environment for building and testing any runtime. Every microservice, irrespective of runtime or build dependencies, gets the same benefits: simple deployments, metrics, key management, build notifications, and anything else that increases productivity. Awesome!
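
For instance, a Maven-based service could keep the same stage layout and simply swap in a Maven CI container. This is an illustrative sketch, not the author’s template; the image tag and commands are assumptions:

```groovy
stage('Build') {
    // Any image with the right build tooling can serve as the CI container.
    agent { docker { image 'maven:3.9-eclipse-temurin-17' } }
    steps { sh 'mvn -B package' }
}
```

The stage structure never changes; only the container image and the command run inside it do, which is exactly what the configurable template exploits.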

Additionally, we got the benefit of having the template code in one place. Now I can version it, add features, and fix bugs, with an easy way to distribute changes across the org. In practice, thousands of microservices receive bug fixes or feature additions just by changing the library dependency (I even have a mechanism to auto-update backward-compatible fixes without the need for manual intervention, yes!). Here’s a VERY scaled down (Hello World version) of what my Jenkins library looks like:

def call(Map config) {
    pipeline {
        agent any

        stages {
            stage('Build') {
                agent { docker { image config.environment.DOCKER_CI_IMAGE } }
                steps { sh "${config.stageCommands.get('build')}" }
            }
            stage('Test') {
                agent { docker { image config.environment.DOCKER_CI_IMAGE } }
                steps { sh "${config.stageCommands.get('test')}" }
            }
            stage('Deploy') {
                agent { docker { image config.environment.DOCKER_CI_IMAGE } }
                steps { sh "${config.stageCommands.get('deploy')}" }
            }
        }
    }
}

The above pipeline would be added to your Jenkins library project as a Groovy var. So if I named it myPipeline.groovy in a Jenkins library project named my-pipeline-lib, I could use myPipeline in any build that adds the lib as a dependency. Here’s an example of the original pipeline now using my-pipeline-lib from the master branch. It passes all of the configuration to the pipeline as a Groovy config map, including the CI image it needs with its dependencies baked in (make, Maven, testing tools, etc.).

@Library('my-pipeline-lib@master') _

myPipeline([
    environment: [
        DOCKER_PROJECT: 'my-starter-project',
        DOCKER_CI_IMAGE: 'my-CI-image-with-make',
    ],
    stageCommands: [
        build: 'make build',
        test: 'make check',
        deploy: 'make publish',
    ]
])
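
Because the template is versioned with the library, a consumer can also pin a tagged release instead of tracking master; backward-compatible fixes then roll out by bumping the tag (or auto-updating it, as described earlier). The version number here is hypothetical:

```groovy
// Pin the shared library to a tagged release rather than a moving branch.
@Library('my-pipeline-lib@1.4.0') _

myPipeline([
    environment: [
        DOCKER_PROJECT: 'my-starter-project',
        DOCKER_CI_IMAGE: 'my-CI-image-with-make',
    ],
    stageCommands: [
        build: 'make build',
        test: 'make check',
        deploy: 'make publish',
    ]
])
```

Pinning trades instant rollout for reproducible builds; tracking master gives every project the newest template on its next run.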

Conclusion

This is a very simple version of what I’m actually using for CI/CD, but you can see the benefits. My current solution has many features and went through several iterations to get things exactly the way I wanted. I now have a single pipeline that can build any runtime the engineers configure. All of my services are backed by the exact same build template, and if a bug is found, it can be fixed in one place. You still get the benefits of Pipeline as Code, but without the repeated, copy-pasted Jenkinsfiles. Additionally, this gave me a way to add logging, Slack notifications, Docker cleanup, private repository access, and much more in a Unified Pipeline across all projects. Bootstrapping new microservices has become easy with a consistent build mechanism, allowing engineers to focus on what’s most important: the software.

Thanks for taking a look, and check out the links below for additional information on building Jenkins libraries.

Helpful Resources

How to build a Jenkins Library
Groovy Docs
