
Jenkins shared libraries: tested

15 October 2017 (updated October 18th, 2017)

Jenkins is a very neat tool for implementing a continuous delivery process, mainly due to its flexibility. Sometimes it can be hard, though, to keep complexity low, and when that happens, (automated) tests become far more important.

Jenkins should in fact be running tests that verify the scripts running tests, proving they actually work. Warning: this dog will chase its own tail.



Jenkins pipeline scripts

Once, not so long ago, the way to go was to manage jobs manually through the GUI. These days CloudBees promotes pipelines as code, written either as procedural Scripted Pipelines or as Declarative Pipelines.

Declarative Pipelines are still in development. Their primary intent is to be editable from a simple GUI.
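For reference, a minimal Declarative Pipeline looks like the sketch below (the stage name and shell command are illustrative, not from the article):

```groovy
// Jenkinsfile: a minimal Declarative Pipeline
pipeline {
    agent any          // run on any available agent
    stages {
        stage('Build') {
            steps {
                sh './gradlew build'
            }
        }
    }
}
```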


Extending Jenkins scripting

In the world of open source, integration is becoming the only major thing a company needs to implement itself.

Using just a single script however will get you only so far. At a certain point you will want to introduce abstractions.


The Jenkins Shared library

CloudBees recently introduced a way to have more than just a single script, while keeping things relatively easy and secure to use.

The ‘shared library’ is a Jenkins concept that allows projects to declaratively import a full repository of classes and scripts.
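As an illustration (the library name and step name below are hypothetical), a Jenkinsfile imports a shared library that was configured in Jenkins like this:

```groovy
// Jenkinsfile: load the shared library registered in Jenkins under this name
@Library('first8-shared-lib') _

// a script from the library's /vars folder, e.g. vars/deployApp.groovy,
// is now callable as a pipeline step:
deployApp(environment: 'staging')
```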

In short, any shared library will have a few common folders:

/          -- Jenkins will load the complete repository; the following places are important:
/src       -- classes, optionally using @Grab to get dependencies
/vars      -- scripts, closures, DSL building
/resources -- anything else (non-code files)
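To make the /vars folder concrete (file and step names below are hypothetical), a custom step is just a Groovy script exposing a call method; the file name becomes the step name:

```groovy
// vars/sayHello.groovy -- available in pipelines as sayHello('...')
def call(String name = 'world') {
    echo "Hello, ${name}!"
}
```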


It comes with tests

As with all new areas of applying code, once maturity is reached, tests become relevant. Fortunately there is a way to run tests for the pipeline code, be it scripts or (typed) classes.

These tests lean towards unit tests, even though the unit boundaries for pipelines are not that clear. Interaction with external systems is of course not something best covered by unit tests. The following examples aim primarily at testing Groovy compilation, execution and (emulated) interpretation by Jenkins. Once clear units (classes) are defined, these become proper targets for real unit tests again.


JUnit example

Note that the code below is Groovy, but it could just as well be Java; mostly a matter of adding semicolons 😉 (okay, maybe a few anonymous classes too, but it should be possible).

package com.first8

import org.junit.Before
import org.junit.Test

import com.lesfurets.jenkins.unit.BasePipelineTest

class BasicPipelineTest extends BasePipelineTest {

    @Before
    void before() {
        setUp() // initializes the pipeline helper from BasePipelineTest
        // mock the pipeline steps the script under test calls
        helper.registerAllowedMethod("pwd", []) { "/tmp/testworld/doesnotexist" }
        helper.registerAllowedMethod("stash", [Map.class]) { println "mock stash called." }
    }

    @Test
    void happyFlowLoading() throws Exception {
        // load and execute the pipeline script
        Script script = loadScript("resources/com/first8/grails3/DefaultJenkinsfile")
        // show what was executed (helper method from the base class)
        printCallStack()
    }
}

Spock example

Since pipeline scripts are written in Groovy, one might as well write the tests in Groovy too.

import com.lesfurets.jenkins.unit.RegressionTest
import com.lesfurets.jenkins.unit.cps.BasePipelineTestCPS

import spock.lang.Specification

abstract class JenkinsSpecification extends Specification implements RegressionTest {

    /** Delegate to the JUnit CPS-transforming base test */
    BasePipelineTestCPS baseTest

    /** Do the common setup */
    def setup() {
        // Set callstacks path for RegressionTest
        callStackPath = 'jenkinsSpec/callstacks/'

        baseTest = new BasePipelineTestCPS()
        baseTest.setUp()
    }
}

class ExampleSpec extends JenkinsSpecification {

    def setup() {
        // like with JUnit, the helper is available here (via baseTest.helper)
    }

    void cleanup() {
        // print or verify the recorded call stack here if needed
    }

    def "example spec"() {
        when:
        println "nothing happens in this spock specification, success!"

        then:
        true
    }
}


Project layout

To work with a shared library, a proper project setup is convenient, so why not use Groovy through Gradle? The layout of a Jenkins shared library is not standard for Gradle, which means a little configuration is needed.

Once the configuration is in place, the project builds quite easily. There are some snags though, which show up in the Gradle config file below. Together, these little problems make the build.gradle file not quite as trivial as one might hope.



Gradle config

Check the following build.gradle config; the comments hint at possible difficulties.

// Apply the java plugin to add support for Java
apply plugin: 'java'
apply plugin: 'groovy'
apply plugin: 'eclipse'
apply plugin: 'idea'
apply plugin: 'project-report'

targetCompatibility = '1.8'
sourceCompatibility = '1.8'

model {
    components {
        main(JvmLibrarySpec) {
            targetPlatform 'java8'
        }
    }
}

// follow the structure as dictated by Jenkins:
sourceSets {
    main {
        groovy {
            srcDirs = ['src', 'vars']
        }
        resources {
            srcDirs = ['resources']
        }
    }
    test {
        groovy {
            srcDirs = ['test']
        }
    }
}

// allow for the (pipeline code) Ivy Grab/grape system to do some setup...
String home = System.getProperty("user.home")
task initializeGrapeConfig(type: Copy) {
    doFirst {
        println "installing grape config in: ${home}/.groovy"
    }
    from '.'
    include "grapeConfig.xml" // assuming that this file is in the shared library!
    into "${home}/.groovy/"
}
initializeGrapeConfig.group = "build setup"

// In this section you declare where to find the dependencies of your project
repositories {
    // Use 'jcenter' for resolving your dependencies.
    // You can declare any Maven/Ivy/file repository here.
    maven {
        url ""
    }
}

// In this section you declare the dependencies for your production and test code
dependencies {
    // to be sure @Grab compile time dependency downloading works in scripts, have the goods ready
    compile group: 'org.codehaus.groovy', name: 'groovy-all', version: '2.4.9'
    compile group: 'org.apache.ivy', name: 'ivy', version: '2.4.0'

    // The production code uses the SLF4J logging API at compile time
    compile 'org.slf4j:slf4j-api:1.7.21'

    // Declare the dependency for your favourite test framework you want to use in your tests.
    // TestNG is also supported by the Gradle Test task. Just change the
    // testCompile dependency to testCompile 'org.testng:testng:6.8.1' and add
    // 'test.useTestNG()' to your build script.
    compile 'junit:junit:4.12'

    // ======= jenkins pipeline unit testing framework ====== //
    compile group: 'com.lesfurets', name: 'jenkins-pipeline-unit', version: '1.0'

    // For Spock unit tests (not needed when just using JUnit):
    testCompile 'org.spockframework:spock-core:1.1-groovy-2.4'
    testCompile 'cglib:cglib-nodep:3.2.2'
    testCompile 'org.objenesis:objenesis:1.2'
    testCompile 'org.assertj:assertj-core:3.7.0'

    // ============== base of jenkins =============== //
    // (this list may grow, just try to minimize libs coming in):
    compile('org.jenkins-ci.main:jenkins-core:2.69') {
        transitive = false
    }
    compile('org.kohsuke.stapler:stapler:1.251') {
        transitive = false
    }

    // ========= parts of jenkins plugins ============ //
    // might be used in pipeline scripts (note the @jar!):
    compile 'org.jenkins-ci.plugins.workflow:workflow-step-api:2.12@jar'

    // ========= PIPELINE COMPONENT DEPENDENCIES ========//
    // since ivy @Grab is not emulated, we need to include them this way
    compile 'org.jfrog.artifactory.client:artifactory-java-client-services:0.16'
}

// make sure the groovy compiler has all the dependencies before! starting to compile (@Grab needs ivy, for instance)
tasks.withType(GroovyCompile) {
    groovyClasspath += configurations.compile
}

Running the tests

Since Gradle can be started using the brilliant Gradle wrapper, starting the build from Jenkins is easy.

It’s enough to create a typical ‘multi-branch job’ and have a Jenkinsfile with the following:

sh "./gradlew verify"
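Wrapped in a complete Jenkinsfile, that could look like the sketch below (the verify task is assumed to be defined in the build; test or check are the standard Gradle equivalents, and the result path assumes the default Gradle layout):

```groovy
// Jenkinsfile for the shared library's own multi-branch job
node {
    checkout scm
    stage('Test') {
        sh "./gradlew verify"
    }
    // archive the JUnit XML results so Jenkins can trend them
    junit 'build/test-results/**/*.xml'
}
```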

Now each push to the shared library repository will start a test run producing early warnings if something is wrong.


Chasing tails

The recently developed option to test pipeline code is great. As with all testing options though it’s important to keep in mind whether the extra testing effort is worth it.

Having Jenkins run tests on test code that runs tests seems kind of pointless; after all, tests will (hopefully) fail both when the production code is broken and when the tests are broken. In practice, though, having some trivial unit tests for pipeline code delivers a lot of value by creating a very short feedback loop. Whether the pipeline actually does the ‘right’ thing is a different matter: tests that can assert this are far more complicated, and true system tests are probably more suitable for that purpose.


Some things to think about:

  • what is continuation-passing style (CPS)?
  • what would be a convenient strategy to actually integration-/system-test new pipeline code?