December 30, 2025

Harness Dynamic Pipelines: Complete Adaptability, Rock-Solid Governance

Harness Dynamic Pipelines let you create pipelines, or pipeline stages, at runtime

For a long time, CI/CD has been “configuration as code.” You define a pipeline, commit the YAML, sync it to your CI/CD platform, and run it. That pattern works really well for workflows that are mostly stable.

But what happens when the workflow can’t be stable?

  • An automation script needs to assemble a one-off release flow based on inputs.
  • An AI agent (or even just a smart service) decides which tests to run for this change, right now.
  • The pipeline needs to shadow a definition written (and maintained) for a different tool.

In all of those cases, forcing teams to pre-save a pipeline definition, either in the UI or in a repo, turns into a bottleneck.

Today, I want to introduce you to Dynamic Pipelines in Harness.

Dynamic Pipelines let you treat Harness as an execution engine. Instead of having to pre-save pipeline configurations before you can run them, you can generate Harness pipeline YAML on the fly (from a script, an internal developer portal, or your own code) and execute it immediately via API.

See it in action

Why Dynamic Pipelines?

To be clear, dynamic pipelines are an advanced capability. Pipelines that rewrite themselves on the fly are not typically needed and should generally be avoided; they're more complex than you want most of the time. But when you need this power, you really need it, and you want it implemented well.

Here are some situations where you may want to consider using dynamic pipelines.

1) True “headless” orchestration

You can build a custom UI, or plug into something like Backstage, to onboard teams and launch workflows. Your portal asks a few questions, generates the corresponding Harness YAML behind the scenes, and sends it to Harness for execution.

Your portal owns the experience. Harness owns the orchestration: execution, logs, state, and lifecycle management. While mature pipeline-reuse strategies favor consistent templates behind your IDP entry points, some organizations use dynamic pipelines for certain classes of applications that need more flexibility than a fixed template allows.
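As a rough illustration (the template, identifiers, and stage spec below are simplified assumptions, not a prescribed integration), a portal handler might render Harness pipeline YAML from a couple of onboarding answers and then hand it to the Dynamic Execution API described later in this post:

```python
# Sketch: turn portal answers into Harness pipeline YAML. The template is an
# illustration only; a real CI stage also needs codebase and infrastructure config.
from string import Template

PIPELINE_TEMPLATE = Template("""\
pipeline:
  name: $service onboarding
  identifier: ${service}_onboarding
  projectIdentifier: $project
  orgIdentifier: default
  stages:
    - stage:
        name: Build
        identifier: build
        type: CI
        spec: {}   # filled in from the portal's language/build-tool answers
""")

def render_pipeline(answers: dict) -> str:
    """Render pipeline YAML from the portal's onboarding answers."""
    return PIPELINE_TEMPLATE.substitute(
        service=answers["service"],
        project=answers["project"],
    )

print(render_pipeline({"service": "payments", "project": "platform"}))
```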

2) Frictionless migration (when you can’t rewrite everything on day one)

Moving CI/CD platforms often stalls on the same reality: “we have a lot of pipelines.”

With Dynamic Pipelines, you can build translators that read existing pipeline definitions (for example, Jenkins or Drone configurations), convert them into Harness YAML programmatically, and execute them natively. That enables a more pragmatic migration path: incremental rather than a big-bang rewrite. It even supports running the old and new systems in parallel for the short period when both are in place.
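Here is a sketch of what such a translator might look like. The mapping below handles only a trivial Drone config and is an assumption for illustration, not a supported migration tool:

```python
# Sketch: convert a minimal Drone pipeline into Harness pipeline YAML.
# Real configurations (plugins, services, conditions) need a far richer mapping.
import yaml  # PyYAML

def drone_to_harness(drone_yaml: str, project: str, org: str = "default") -> str:
    drone = yaml.safe_load(drone_yaml)
    steps = [
        {"step": {
            "name": s["name"],
            "identifier": s["name"].replace("-", "_"),
            "type": "Run",
            "spec": {"command": "\n".join(s.get("commands", []))},
        }}
        for s in drone.get("steps", [])
    ]
    harness = {"pipeline": {
        "name": drone.get("name", "migrated"),
        "identifier": drone.get("name", "migrated").replace("-", "_"),
        "projectIdentifier": project,
        "orgIdentifier": org,
        "stages": [{"stage": {
            "name": "Build",
            "identifier": "build",
            "type": "CI",
            # simplified: a runnable CI stage also needs codebase/infrastructure config
            "spec": {"execution": {"steps": steps}},
        }}],
    }}
    return yaml.safe_dump(harness, sort_keys=False)
```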

3) AI and programmatic workflows (without the hype)

We’re entering an era where more of the delivery workflow is decided at runtime, sometimes by policy, sometimes by code, sometimes by AI-assisted systems. The point isn’t “fully autonomous delivery.” It’s intelligent automation with guardrails.

If an external system determines that a specific set of tests or checks is required for a particular change, it can assemble the pipeline YAML dynamically and run it. That's a practical step toward more programmatic stage and step generation over time. For that to work, the underlying DevOps platform must support dynamic pipelining. Harness does.

How it works

Dynamic execution is primarily API-driven, and there are two common patterns.

1) Fully dynamic pipeline execution

You execute a pipeline by passing the full YAML payload directly in the API request.

Workflow: your tool generates valid Harness YAML → calls the Dynamic Execution API → Harness runs the pipeline.
Result: the run starts immediately, and the execution history is tagged as dynamically executed.
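A minimal sketch of that flow in Python, assuming an endpoint shaped like the one below; the endpoint path and query parameters are illustrative assumptions, so check the Harness API documentation for the exact contract:

```python
# Sketch: execute a pipeline by passing the full YAML payload to the
# Dynamic Execution API. The URL path below is an assumed placeholder.
import requests

PIPELINE_YAML = """\
pipeline:
  name: dynamic-demo
  identifier: dynamic_demo
  projectIdentifier: my_project
  orgIdentifier: default
  stages:
    - stage:
        name: Hello
        identifier: hello
        type: Custom
        spec:
          execution:
            steps:
              - step:
                  type: ShellScript
                  name: Say hello
                  identifier: say_hello
                  timeout: 10m
                  spec:
                    shell: Bash
                    onDelegate: true
                    source: {type: Inline, spec: {script: echo hello}}
"""

resp = requests.post(
    "https://app.harness.io/pipeline/api/pipeline/execute/dynamic",  # assumed path
    params={"accountIdentifier": "my_account",
            "orgIdentifier": "default",
            "projectIdentifier": "my_project"},
    headers={"x-api-key": "pat.xxxx", "Content-Type": "application/yaml"},
    data=PIPELINE_YAML,
)
resp.raise_for_status()
print(resp.json())  # execution details; the run is tagged as dynamically executed
```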

2) Dynamic stages (pipeline-in-a-pipeline)

You can designate specific stages inside a parent pipeline as Dynamic. At runtime, the parent pipeline fetches or generates a YAML payload and injects it into that stage.

This is useful for hybrid setups:

  • the “skeleton” stays stable (approvals, environments, shared steps)
  • the “variable” parts (tests, deployments, validations) are decided at runtime (see the sketch below)
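As a hedged sketch of that runtime-decided part (the path-to-suite mapping and the stage payload shape are illustrative assumptions, not the documented contract), a small service might pick test steps from the files a change touches and emit the YAML the parent pipeline injects into its dynamic stage:

```python
# Sketch: generate the "variable" stage at runtime based on changed paths.
import yaml  # PyYAML

SUITES = {                      # assumption: repo layout and test commands
    "services/api/": "make test-api",
    "services/web/": "npm test",
    "infra/":        "make test-infra",
}

def dynamic_stage_yaml(changed_files: list[str]) -> str:
    steps = []
    for prefix, command in SUITES.items():
        if any(f.startswith(prefix) for f in changed_files):
            ident = prefix.strip("/").replace("/", "_")
            steps.append({"step": {
                "name": f"Test {ident}",
                "identifier": f"test_{ident}",
                "type": "Run",
                "spec": {"command": command},
            }})
    if not steps:  # nothing matched: fall back to a cheap smoke test
        steps = [{"step": {"name": "Smoke test", "identifier": "smoke",
                           "type": "Run", "spec": {"command": "make smoke"}}}]
    stage = {"stage": {"name": "Selected tests", "identifier": "selected_tests",
                       "type": "CI",
                       "spec": {"execution": {"steps": steps}}}}
    return yaml.safe_dump(stage, sort_keys=False)

# Example: only API paths changed, so only the API suite is included
print(dynamic_stage_yaml(["services/api/handlers/users.py"]))
```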

Governance without compromise

A reasonable question is: “If I can inject YAML, can I bypass security?”
Bottom line: no.

Dynamic pipelines are still subject to the same Harness governance controls, including:

  • RBAC: users still need the right permissions (edit/execute) to run payloads
  • OPA policies: policies are enforced against the generated YAML
  • Secrets & connectors: dynamic runs use the same secrets management and connector model

This matters because speed and safety aren’t opposites if you build the right guardrails—a theme that shows up consistently in DORA’s research and in what high-performing teams do in practice.

Getting started

To use Dynamic Pipelines, enable Allow Dynamic Execution for Pipelines at both:

  • the Account level, and
  • the Pipeline level

Once that’s on, you can start building custom orchestration layers on top of Harness: portals, translators, internal services, or automation that generates pipelines at runtime.

The takeaway here is simple: Dynamic Pipelines unlock new “paved path” and programmatic CI/CD patterns without giving up governance. I’m excited to see what teams build with it.

Ready to try it? Check out the API documentation and run your first dynamic pipeline.

Eric Minick

Eric Minick is an internationally recognized expert in software delivery with experience in Continuous Delivery, DevOps, and Agile practices, working as a developer, marketer, and product manager. Eric is the co-author of “AI Native Software Delivery” (O’Reilly) and is cited or acknowledged in the books “Continuous Integration,” “Agile Conversations,” and “Team Topologies.” Today, Eric works on the Harness product management team to bring its solutions to market. Eric joined Harness from CodeLogic, where he was Head of Product.
