The Unified Pipeline: CI/CD in the Cloud

For centuries, pipelines have been built to move gas, oil, and other important resources. The concept of the pipeline was built around moving things quickly, reliably, and consistently with minimal maintenance. In software, the same principles apply.

One of the keys to a successful development shop is how fast and efficiently you can get software in front of customers. No matter how great a developer you may be, your first iteration probably won't be perfect and will likely need several rounds of modifications to work out all the kinks. How can we iterate quickly? I believe that with smaller services, isolated deployments, and an efficient pipeline, you can deliver products faster and more reliably. The Unified Pipeline is the bridge to get there.

What is The Unified Pipeline?

In the Cloud era, we are no longer limited to deploying runtimes on dedicated host machines; our systems are now composed of heterogeneous services. You may have some services in Java, some in Go, some in Python or other languages. A single deployment pipeline that lets software engineers continuously integrate and deploy any of these technologies lines up well with a Polyglot Cloud environment. In this post, we'll discuss a way to make that a reality with Jenkins Pipeline.

Jenkinsfile

Jenkins Pipeline uses a Jenkinsfile for declarative pipeline configuration. There are a lot of benefits to using a Jenkinsfile over conventional build configurations, and you can find many articles on that topic. One of the major drawbacks of configuration as code, though, is that the code can get large, which becomes a major headache when it's repeated across multiple projects. I found that once I had a really good pipeline configured, I needed most, if not all, of the same configuration for my next service. This became a problem as we copied big Jenkinsfiles from one microservice to the next. In some cases I used a different language, but the build was nearly the same every time. The steps were pretty much Build, Test, Dockerize, Deploy, Integration Test. Although this is a simplified version of what my end result turned out to be, the stages were consistent across the majority of my microservices.

Jenkins Library

I decided to use my pipeline as a template and build a Jenkins Library that could be leveraged in any project. Yes, I wanted a reusable pipeline that behaved like any other component in my infrastructure. The trick was making the entire pipeline configurable, so that different runtimes could be supported through CI containers and the commands run in each stage could be configured. Here's your "Hello World" example of a typical pipeline.
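The example below is a minimal sketch rather than my exact setup; the agent image and the make targets are placeholders.

// Jenkinsfile: a minimal "Hello World" declarative pipeline
pipeline {
    agent {
        // CI container that provides the build toolchain; the image is a placeholder
        docker { image 'golang:1.22' }
    }
    stages {
        stage('Build') {
            steps { sh 'make build' }
        }
        stage('Test') {
            steps { sh 'make test' }
        }
        stage('Dockerize') {
            // assumes the agent can reach a Docker daemon
            steps { sh 'make docker' }
        }
        stage('Deploy') {
            steps { sh 'make deploy' }
        }
        stage('Integration Test') {
            steps { sh 'make integration-test' }
        }
    }
}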

As you can see, the above pipeline has a make dependency. What if we used Maven? Well, the container would need to pull in Maven, and the appropriate Maven commands would run in the right stages. The same goes for Gradle or any other build tool. I decided to take my entire pipeline, like the one above, harden it, and make it completely configurable. This let me keep the code in one place where I could version it, upgrade it, and fix bugs, and it allowed every microservice to receive bug fixes or feature additions simply by changing the library dependency. Here's a VERY scaled-down version of what my Jenkins library looks like:
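The sketch below is written in scripted style to keep it short, and the config keys (ciImage, buildCommand, and so on) are illustrative names for this post rather than my real ones.

// A heavily scaled-down, configurable pipeline step; config keys are illustrative
def call(Map config) {
    node {
        // Pull the source and run every stage inside the configured CI container
        checkout scm
        docker.image(config.ciImage).inside {
            stage('Build') {
                sh config.buildCommand
            }
            stage('Test') {
                sh config.testCommand
            }
            stage('Dockerize') {
                // assumes the CI container can reach a Docker daemon
                sh config.dockerizeCommand
            }
            stage('Deploy') {
                sh config.deployCommand
            }
            stage('Integration Test') {
                sh config.integrationTestCommand
            }
        }
    }
}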

The above pipeline would be added to your Jenkins library project as a Groovy var. So if I named it myPipeline.groovy in a Jenkins library project named my-pipeline-lib, I could use myPipeline in any build that adds the lib as a dependency. Here's an example of the original pipeline now using my-pipeline-lib from the master branch. It passes all of the configuration to the pipeline as a map.
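A sketch of that usage, with the same illustrative config keys as above:

// Jenkinsfile in a microservice, delegating the whole build to the shared library
@Library('my-pipeline-lib@master') _

myPipeline(
    ciImage:                'golang:1.22',          // placeholder CI container
    buildCommand:           'make build',
    testCommand:            'make test',
    dockerizeCommand:       'make docker',
    deployCommand:          'make deploy',
    integrationTestCommand: 'make integration-test'
)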

Conclusion

This is a very simple version of what I'm actually using to do CI/CD, but you can see the benefits. My current solution has many features and went through several iterations to get things exactly the way I wanted them. I now have a single pipeline that can run any runtime I configure. All of my services are backed by the exact same code, and if a bug is found, it can be fixed in one place. You still get the benefits of Pipeline as Code, but without the repeated, copy-pasted Jenkinsfiles. Additionally, this gave me a way to add logging, Slack notifications, Docker cleanup, private repository access, and more in a Unified Pipeline across all projects. Bootstrapping new microservices has become easy with a consistent build mechanism, allowing engineers to focus on what's most important: the software.

Thanks for taking a look, and check out the links below for additional information on building Jenkins libraries.

Helpful Resources

How to build a Jenkins Library
Groovy Docs
