June 29, 2021

What is a Containerized Pipeline?

Kubernetes has become a big player in the software delivery world as teams shift to containers for an easier, more efficient way to develop, ship, and deploy code. A container is a standard unit of software that packages code together with all of its dependencies, so the application runs quickly and reliably from one computing environment to another.

What Are Pipelines in Software Development?

Like a physical pipeline, pipelines in software development carry something from place to place - in our case, code. Pipelines exist to streamline the software development process. They bring automation into DevOps, specifically into Continuous Integration and Continuous Delivery, by taking source code, packaging it into an artifact, and eventually deploying it to production. Within a pipeline, a developer specifies a container to perform a certain job. The pipeline first fetches and starts the container, then runs each step of the job inside it until the job completes. This process repeats for every job until the end of the pipeline is reached.
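As a sketch, a containerized pipeline configuration often looks something like the following (illustrative, GitLab-CI-style YAML - the image names and commands are assumptions, not from a real project):

```yaml
# Each job declares the container image it runs in;
# the runner fetches the image, starts a container,
# and executes the job's steps inside it.
build-job:
  image: golang:1.16   # hypothetical build image
  script:
    - go build ./...

test-job:
  image: golang:1.16
  script:
    - go test ./...
```

Every job gets a fresh container from the declared image, so its runtime environment is identical on every run.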

Why Containerized Pipelines?

Knowing the benefits of containers, why not put pipelines in them? Containerized pipelines allow for isolation between pipeline jobs. Every job runs from a container image that includes all the dependencies the application needs. Because of this, there aren't any dependency conflicts between pipeline jobs - even if they run on the same node! Containers are also immutable - once an image is built, it is unchangeable, and making a change means building a new image. As a result, each pipeline job's runtime environment is exactly the same on every run. Containerized pipelines also make the shift to a Kubernetes cluster easy, which allows for scalability. Leveraging Kubernetes as a resource for pipelines is a definite added benefit.

Different Stages of Pipelining in Software Delivery

Build

The build stage is where the entire process starts. This is the stage where the application is compiled - translated from human-readable source code into an executable program. In a pipeline, a build is usually triggered by a commit or a change in code. Within this process, the code is fetched from the source code/Git repository, compiled, and its dependencies are checked. With containerized pipelines, the build stage is easier to manage because there is no dependency hell: container images package up all the dependencies the application needs, so the developer doesn't have to manage them. Docker, an open platform for developing, shipping, and running applications, makes this easy. With Docker, developers write Dockerfiles (the build instructions for a Docker image), run Docker containers (executable instances of an image), and store images in a Docker registry.
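For instance, a minimal Dockerfile for a build stage might look like this (illustrative only - the base image, paths, and commands are assumptions):

```dockerfile
# Start from a base image that already bundles the build toolchain.
FROM golang:1.16

# Copy the source code into the image.
WORKDIR /app
COPY . .

# Compile the application; the resulting image carries the binary
# and every dependency it needs to run.
RUN go build -o /usr/local/bin/myapp .

CMD ["myapp"]
```

Once built, the image is pushed to a registry, where later pipeline stages can pull it.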


Test

Testing follows the build stage. There are many different testing types, such as unit testing, integration testing, and vulnerability testing, and each type can be its own step in the pipeline. With all these testing methodologies, developers don't have to worry about having the correct dependencies installed, as each container image within the pipeline ships with the corresponding packages.
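To sketch that idea, each test type can run in an image that already carries its tooling (GitLab-CI-style YAML; the images and commands are illustrative assumptions):

```yaml
unit-tests:
  image: golang:1.16           # compiler and test runner built in
  script:
    - go test ./...

vulnerability-scan:
  image: aquasec/trivy:latest  # scanner image with its own toolchain
  script:
    - trivy filesystem .
```

Neither job needs to install anything at runtime - the image is the dependency manifest.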


Deploy

Next, the deploy stage is where the code is deployed to production, facilitated through Continuous Deployment. With containers, deploying an application is easier and more secure: an application can be modified, rebuilt as a new image, and run almost instantly.


Release

Finally, we hit the release stage, where the application is delivered to a repository. This is the true stage of Continuous Delivery. With pipelines and automation, this process has become simple - and with the addition of containers, it is that much more efficient and secure.

Different Containerized Pipeline Options

Azure DevOps

Azure DevOps encompasses all the tools a developer may need to run DevOps successfully: project management, source control management, and CI/CD. This allows developers to configure each part of the software delivery lifecycle. Azure DevOps also integrates well with containers, offering features like access to Azure Container Registry and Azure Kubernetes Service. To configure pipelines, developers can either use the web-based UI or check YAML configuration files into the application's source repository. Since Azure DevOps is an all-in-one solution, it shines when you use the entire suite of products. However, if you're only using a piece or two, it can be difficult to set up and overly complex for teams that just want a basic container pipeline.
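As a minimal sketch, an Azure Pipelines YAML file can ask for a job's steps to run inside a container (the image and commands here are assumptions, not from a real project):

```yaml
trigger:
  - main

pool:
  vmImage: 'ubuntu-latest'

# Run every step of this job inside the specified container image.
container: node:16

steps:
  - script: npm ci
    displayName: Install dependencies
  - script: npm test
    displayName: Run tests
```

The agent pulls the `node:16` image, starts the container, and executes each step inside it.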

GitLab CI/CD

GitLab CI/CD also provides an all-in-one solution, including project management, private container registries, and orchestrated build environments. To build a containerized pipeline, a developer configures the .gitlab-ci.yml file; the jobs it defines are executed by GitLab Runner. Because it leverages YAML for its configuration files, GitLab CI/CD allows for a lot of customization and flexibility. It also integrates with open-source tools, which lets it support a wide range of project types, languages, and deployment targets. On the downside, GitLab can't provision or maintain environments besides Google Kubernetes Engine and Amazon Elastic Kubernetes Service, so the entire process isn't fully automated. It also lacks a graphical pipeline editor, which would improve the developer experience by making complex pipelines easier to build and maintain.
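For example, a .gitlab-ci.yml job can build an image and push it to the project's private container registry using GitLab's predefined CI variables (a sketch - the Docker versions and tag scheme are assumptions):

```yaml
build-image:
  image: docker:20.10
  services:
    - docker:20.10-dind   # Docker-in-Docker service for building images
  script:
    # Authenticate against the project's built-in container registry.
    - docker login -u $CI_REGISTRY_USER -p $CI_REGISTRY_PASSWORD $CI_REGISTRY
    # Build and tag the image with the commit SHA, then push it.
    - docker build -t $CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA .
    - docker push $CI_REGISTRY_IMAGE:$CI_COMMIT_SHORT_SHA
```

`CI_REGISTRY`, `CI_REGISTRY_IMAGE`, and `CI_COMMIT_SHORT_SHA` are variables GitLab injects into every job, so the same config works across projects.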

Jenkins X

Everyone knows about Jenkins - it is the most popular CI/CD solution out there. Jenkins X takes it a step further with comprehensive Kubernetes integration: it can not only deploy to a cluster, but also provision and manage it for developers. Jenkins X pipelines are built on Tekton Pipelines, which run CI/CD pipelines natively on Kubernetes. Instead of using Jenkinsfiles for configuration, developers configure Jenkins X pipelines in a jenkins-x.yml file. Jenkins X combines the two most leveraged open-source technologies in the DevOps space at the moment: Jenkins and Kubernetes. Its native Kubernetes integration can be a benefit, but it can also be a limitation, as Jenkins X requires Kubernetes and is very opinionated about how the cluster is configured. As far as CD tools go, it's an option we can recommend.
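As a rough sketch of the shape of a jenkins-x.yml file (the build pack, step names, and commands here are assumptions - consult the Jenkins X documentation for the exact schema):

```yaml
buildPack: go   # hypothetical build pack selection
pipelineConfig:
  pipelines:
    pullRequest:
      build:
        steps:
          - sh: go test ./...   # run tests for each pull request
    release:
      build:
        steps:
          - sh: go build ./...  # build the release artifact
```

Each step ultimately runs as a containerized Tekton step on the Kubernetes cluster that Jenkins X manages.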

How Harness Utilizes Containerized Pipelines

The Harness Platform is like a universal adapter. It can be plugged in to connect any cloud provider, container platform, or tech integration. Because of how versatile it is, the platform can connect with any container orchestration tool to create containerized pipelines.

Harness CI Enterprise, the newest module of the Harness Platform, runs every step of a pipeline in its own container. Within the pipeline, the developer specifies a container image for each step, and the agent fetches and starts the container in which the job will run. Because each step runs in its own container and every plugin has its own container, the developer doesn't need to worry about dependency hell.
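To illustrate the idea, a Harness-style pipeline definition might look roughly like this (an illustrative sketch, not exact Harness syntax - the stage, step, and image names are assumptions):

```yaml
pipeline:
  name: example-pipeline
  stages:
    - stage:
        name: build-and-test
        type: CI
        spec:
          execution:
            steps:
              - step:
                  type: Run
                  name: run-tests
                  spec:
                    image: golang:1.16   # hypothetical step image
                    command: go test ./...
```

Each step names its own image, so every step gets an isolated, reproducible environment.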

Take the Next Step in CI/CD

Pipelines have such a big impact in the DevOps world as they help automate and streamline the software delivery process. To take a deeper dive into common pipeline patterns, their strengths, their weaknesses, and how to pick the best one for your given use case, check out our Pipeline Patterns eBook.

Harness has created an end-to-end platform to help developers with their software delivery process. To check out how we take advantage of containerized pipelines, sign up for a free trial today!
