December 8, 2016

Day 8 - Building Robust Jenkins Pipelines

Written By: Michael Heap (@mheap)
Edited By: Daniel Maher (@phrawzty)

For many years continuous integration was synonymous with the name Jenkins, but as time went on it fell out of favour and was replaced by newer kids on the block such as Travis, GoCD and Concourse. Users became frustrated with Jenkins’ way of defining build jobs and looked to services that let them define their builds as code alongside their projects, building real pipelines with fan-in/fan-out capabilities.

Jenkins recently hit version 2, and with it came a whole host of new features! There were some small additions to the base install (such as the git plugin being shipped by default), but true to its roots, Jenkins still ships most of its functionality as plugins. Functionality such as defining your job configuration in code is provided by plugins that are maintained by the Jenkins core team, which means they’re held to the same high standards as Jenkins itself.

In this article we’ll be building an Electron application and adding it to Jenkins. While we’ll be using the GitHub Organization plugin, it’s not essential - you can configure repositories by hand if you prefer.

The majority of the code in this article is simple calls to shell commands, but a passing familiarity with either Java or Groovy may help later on.

Getting Started

In this article, we’re going to be building a pipeline that builds, tests, and packages an Electron application for deployment on Linux. We’ll start with a basic Jenkins installation and add all of the plugins required, before writing a Jenkinsfile that builds our project.

Let’s create a Vagrant virtual machine to run Jenkins inside. This isn’t a requirement for running Jenkins, but it makes it slightly easier to get up and running.

Creating a VM

vagrant init ubuntu/xenial64
sed -i '/#.*private_network*/ s/^  #/ /' Vagrantfile # Enable private networking
vagrant up
vagrant ssh
wget -q -O - https://pkg.jenkins.io/debian/jenkins-ci.org.key | sudo apt-key add -
sudo sh -c 'echo deb http://pkg.jenkins.io/debian-stable binary/ > /etc/apt/sources.list.d/jenkins.list'
sudo apt-get update
sudo apt-get install -y jenkins
sudo cat /var/lib/jenkins/secrets/initialAdminPassword
exit

Visit http://192.168.33.10:8080 on your local machine and continue the setup.
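
The sed command above uncomments the private-networking line that ships in the stock Vagrantfile. If the substitution doesn’t match on your version of Vagrant, you can make the equivalent change by hand - the relevant section of the Vagrantfile (a sketch based on the default template; the IP is whatever the commented-out line specifies) looks like this:

```ruby
# Vagrantfile - give the VM a static address on a host-only network
Vagrant.configure("2") do |config|
  config.vm.box = "ubuntu/xenial64"

  # This is the line the sed command uncomments. It provides the
  # 192.168.33.10 address used to reach Jenkins from the host.
  config.vm.network "private_network", ip: "192.168.33.10"
end
```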

Bootstrapping Jenkins

You’ll need two plugins to get started: the Pipeline plugin and the GitHub Organization Folder plugin. These form the cornerstone of your Jenkins install. Click on None at the top then type pipeline into the search box. Select Pipeline and GitHub Organization Folder before clicking Install.

You’ll be prompted to create a user once the plugins are installed. Do this now, then click on Start using Jenkins to get underway.

Configuring Jenkins

Now that you’re logged in, there’s a little bit of work to do to configure Jenkins so that we can start using it. We’ll need to provide access credentials and set up our GitHub Organization job so that Jenkins knows what to build. Sadly, we need to do this even if we’re only working with public repositories, as certain endpoints on the GitHub API require authentication.

  • Click Credentials on the left hand side.

  • Click on (global) next to the Jenkins store.

  • Click Add Credentials on the left.

  • Decide what kind of credentials you want to add. I’m using username & password authentication for HTTPS clones from GitHub, so I visited https://github.com/settings/tokens and generated a token that had only the repo OAuth scope.

  • Provide the credentials by filling in the username and password fields. (Note that the fields are different if you’re using SSH authentication.)

  • Click Jenkins in the top left to go back to the homepage.

  • Click New Item on the left.

  • Choose GitHub Organization and give it a name (I’ve called mine michael-test), then click OK.

  • Under Repository sources, set the Owner field to be either your organisation name or your personal account name.

  • Select the credentials you just set up using the Scan Credentials option.

  • Click Advanced under Repository name pattern and make sure that Build origin PRs merged with base branch is enabled. This will make Jenkins build any incoming pull requests.

It’s worth noting at this point that under the Project Recognizers section it says Pipeline Jenkinsfile. Jenkins will scan an organisation or account for any repos with branches that contain a “Jenkinsfile” at their root and create a build job for that branch.

Let’s create your first Jenkinsfile and kick off a build.

Writing your first Jenkinsfile

We need a project to build, so let’s create one. For now it’ll be an empty project that only contains a Jenkinsfile.

mkdir demo-electron
cd demo-electron
git init
touch Jenkinsfile

Edit Jenkinsfile and add the following content:

echo "Hello World"

Commit and push this up to GitHub (you’ll need to create a repository first). I’ve called my repo demo-electron.

git add .
git commit -m "Initial Jenkinsfile"
git remote add origin git@github.com:michael-test-org/demo-electron.git
git push -u origin master

Go back to Jenkins, and you’ll see an option on the left called Re-scan organization. Click it, then click on the Run now option that appears. You’ll see some log output where it detects the Jenkinsfile and creates a job for you.

Within michael-test there’s now a job called demo-electron, which matches our repository name. Clicking through, we can now see a master job within. By default, Jenkins will build all existing branches. Click into the master job and then click on #1 on the left hand side to see the results from the first build that was performed when the repository was detected. If you click on Console Output on the left, you’ll see Jenkins cloning our repo before outputting “Hello World”.

Congratulations, you just wrote your first Jenkinsfile! It doesn’t do much, but it proves that our Jenkins instance is up and running, detecting pipelines, and automatically executing them.

Throughout the rest of this post, we’ll be building a small Electron based desktop application. This means that our build machine will need some additional dependencies installed. If you want to work along with this tutorial, your Vagrant machine will need a little bit of configuration. Log in to your machine with vagrant ssh and run the following command to install the necessary dependencies to build an Electron application:

sudo apt-get install -y nodejs nodejs-legacy npm jq

Adding Electron

We’re going to add Electron to the project at this point. Instead of working directly on master, we’re going to work on a new branch called add-electron.

Create a new branch to work on add-electron by running the following command:

git checkout -b add-electron

The first thing we need to do is create a standalone Electron application. To do this, we follow the Electron Quick Start document. Go ahead and follow that - once you’re done you should have a package.json, main.js and index.html. We also need to install Electron as a dependency, so run the following command on your local machine in the same directory as package.json.

npm install electron --save-dev

This will make Electron available to run our app. You can test it by running ./node_modules/.bin/electron . in the same folder as your main.js file. This will show a “Hello World” page.

This gives us our first real action to run in Jenkins! Although running npm install doesn’t feel like a lot, ensuring that we can install our dependencies is a great first step towards continuous builds.

Jenkinsfile

It’s time to change our Jenkinsfile! It’ll be a nice simple one to start with: first make sure that we’re on the right branch, then run npm install.

The first thing to do is delete our “Hello World” line and add the following contents:

node {
  
}

This declaration tells Jenkins that anything inside the braces must run on a build agent. Our old echo statement could run without a workspace (a copy of our repository on disk) but as we’re about to interact with the repository we need a workspace.

The next thing to do is to tell Jenkins to make sure that our repository is in the correct state based on the job that we’re running. To do this, we run a special checkout scm command that makes sure we’re on the correct branch. Finally, we run npm install using the sh helper to run a shell command. Our entire Jenkinsfile looks like this after the changes:

node {
  checkout scm
  sh 'npm install'
}

If we commit and push this, Jenkins will build the project using our new Jenkinsfile and run npm install each time we commit.

git add Jenkinsfile index.html main.js package.json
git commit -m "Create Electron app"
git push origin add-electron

While it is possible for Jenkins to be notified by GitHub each time there is a commit to build, this won’t work for us as we’re testing with a local Jenkins instance. Instead, we need to tell Jenkins to check for changes by visiting the main page for our job and clicking on Branch Indexing, then Run Now. Jenkins will then rescan the repo for new branches, add our add-electron branch, and run the new Jenkinsfile.

Just like we could view the build for the master branch, we can click into add-electron, click on Console Output on the left and watch as Jenkins runs npm install for us.

At this point, we can raise a pull request to merge our branch into master on GitHub. Once Jenkins rescans the repository (manually triggered, see above), it will detect this pull request and automatically update the pull request’s commit status with the outcome of the build.

Once the request is merged and our add-electron branch is deleted, Jenkins will go back to just building the master branch on its own.

Creating a build

We now have an application that’s building, but we need a way to package it up and ship it to customers. Let’s create a new working branch called build-app by running the following command:

git checkout -b build-app

There’s a project called electron-builder which can help us build our application. Just like with Electron, we need to require it in our package.json by running the following on your local machine in the same directory as package.json:

npm install electron-builder --save-dev

Next, we need to add some details to our package.json for the builder to use, including configuration values such as our application’s description and application identifier. We’ll also add a few npm scripts that build the application for different operating systems.

Update your package.json to look like the following, which contains all of the required information and build scripts:

{
  "name": "sysadvent",
  "version": "0.1.0",
  "main": "main.js",
  "devDependencies": {
    "electron": "^1.4.10",
    "electron-builder": "^10.4.1"
  },
  "description": "SysAdvent",
  "author": "Michael Heap <m@michaelheap.com>",
  "license": "MIT",
  "build": {
    "appId": "com.michaelheap.sysadvent",
    "category": "app.category.type"
  },
  "scripts": {
      "start": "electron main.js",
      "build": "build --mwl --x64 --ia32",
      "build-windows": "build --windows --x64 --ia32",
      "build-mac": "build --mac --x64 --ia32",
      "build-linux": "build --linux --x64 --ia32"
  }
}

At this point, you can test the build on your local machine by running npm run build-<os>; in my case I run npm run build-linux and the built applications show up in the dist folder. I can run ./dist/sysadvent-0.1.0-x86_64.AppImage to launch the application. If you’re on OS X, there should be a .app file in the dist folder, and if you’re on Windows there should be an installer .exe.

This is another huge step in our build process. We now have a tangible build that we can distribute to people! Let’s update our build process so that it builds the application. Edit your Jenkinsfile and add a call to npm run build-linux, as we’re going to be building on a Linux machine.

node {
    checkout scm
    sh 'npm install'
    sh 'npm run build-linux'
}

This is almost enough to have our build automated, but we somehow have to get the build artefacts out of our workspace and make them available for people to download. Jenkins has built-in support for artefacts, which means there’s a built-in archiveArtifacts step that we can use. We tell Jenkins to step into the dist folder with the dir command, then tell it to archive all artefacts in that directory that end with .AppImage. Your final Jenkinsfile will look like the following:

node {
    checkout scm
    sh 'npm install'
    sh 'npm run build-linux'
    dir('dist') {
      archiveArtifacts artifacts: '*.AppImage', fingerprint: true;
    }
}

If we commit and push this up to GitHub and retrigger the branch indexing, Jenkins will pick up the new branch, build our application, and publish an artefact.

git add Jenkinsfile package.json
git commit -m "Build packaged application"
git push origin build-app

Once the branch has built and is green, we can raise a pull request and get our changes merged into master. At this point, we have an application that has a build process and creates a packaged application that we can send out to users.

A more complex Jenkinsfile

What we’ve put together so far is a good introduction to building applications with Jenkins, but it’s not really representative of a real world project that would contain tests, code style linting, and various other build steps.

On the full-application branch, you’ll find a build of the Electron application that has basic functionality added, tests for that functionality, and code style linting added. With the addition of these common steps, the Jenkinsfile is already starting to grow.

node {
    checkout scm
    sh 'npm install'
    sh 'npm test'
    sh 'npm run lint'
    sh 'npm run build-linux'
    dir('dist') {
      archiveArtifacts artifacts: '*.AppImage', fingerprint: true;
    }
}

It’s still relatively easy to understand this Jenkinsfile, but imagine that we’ve copied and pasted it into half a dozen different applications that need to build an Electron app for distribution. If we wanted to update our application and offer an OS X build too, we’d have to update every project and edit every Jenkinsfile one by one. Fortunately, Jenkins has a solution for this too: the Jenkins Global Library.

Creating a Global Library

When writing software applications we frown upon copy and pasting code between files, so why is it accepted when building deployment pipelines? Just like you’d wrap your code up in a library to reuse in your application, we can do the same with our Jenkins build steps.

Just as the Jenkinsfile is written in the Groovy language, our global library will be written in Groovy too. Groovy runs on the JVM, so the namespace structure is very similar to Java. Create a new folder and bootstrap the required files:

mkdir global-lib
cd global-lib
mkdir -p src/com/michaelheap
touch src/com/michaelheap/ElectronApplication.groovy

The ElectronApplication file is where our build logic will live. Take everything in your Jenkinsfile and paste it into ElectronApplication.groovy inside an execute function:

#!/usr/bin/env groovy

package com.michaelheap;

def execute() {
  node {
    checkout scm
    sh 'npm install'
    sh 'npm test'
    sh 'npm run lint'
    sh 'npm run build-linux'
    dir('dist') {
      archiveArtifacts artifacts: '*.AppImage', fingerprint: true;
    }
  }
}

return this;

Then update your Jenkinsfile so that it calls just this file. Groovy files are automatically compiled into a class that has the same name as the file, so this would be available as new ElectronApplication(). Your Jenkinsfile should look like the following:

def pipe = new com.michaelheap.ElectronApplication()
pipe.execute()

Once we update all of our applications to run this class rather than having the same thing in multiple places, any time we need to update the build pipeline we only need to update it in our Global Library and it will automatically be used in the next build of any job that runs.

There is one final step we need to perform before Jenkins can start using our global library. We need to publish it to a repository somewhere (mine’s on GitHub) and load it into the Jenkins configuration. Click on Manage Jenkins on the homepage and then Configure System before scrolling down to Global Pipeline Libraries.

To make Jenkins load your library, follow these steps:

  • Click Add.
  • Provide a name.
  • Set the default version to master.
  • Make sure Load Implicitly is checked, so that we don’t need to declare @Library in every Jenkinsfile.
  • Click Modern SCM.
  • Enter your organisation/account username into the Owner box.
  • Select the credentials to use for the scan.
  • Select your Global Library repository.
  • Click Save at the bottom.
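
If you’d rather not load the library implicitly, a Jenkinsfile can request it explicitly instead. A minimal sketch, assuming the library was registered under the (hypothetical) name my-global-lib:

```groovy
// Load the shared library by name at its configured default version,
// then call into it exactly as before.
@Library('my-global-lib') _

def pipe = new com.michaelheap.ElectronApplication()
pipe.execute()
```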

The next time any job that references our new com.michaelheap.ElectronApplication definition runs, the global library will automatically be downloaded and imported so that any functions defined in the file can be used. In this case we call execute which runs everything else we needed.

Making the library DRY

Having a Global Library is a huge step towards re-usability, but if we ever needed to build an Electron application that didn’t have any tests to run, or it needed some additional steps running (or even the same steps in a different order) we’d need to copy and paste our ElectronApplication definition and make the changes in a new file. Isn’t this what we were trying to get away from?

Fortunately, I found an awesome example of how you can build your jobs as a pipeline of steps to be executed in the TYPO3-infrastructure project on GitHub. It’s quite tough to explain, so instead let’s work through an example. Let’s take our existing ElectronApplication and break it down into five different steps:

  • npm install
  • npm test
  • npm run lint
  • npm run build-linux
  • archiveArtifacts

Each of these is a command that could be run independently, so instead of having them all in a single file, let’s give each of them its own class by creating some new files:

cd src/com/michaelheap
touch Checkout.groovy InstallDeps.groovy Test.groovy Lint.groovy Build.groovy ArchiveArtifacts.groovy

We move each command out of ElectronApplication and into an execute function in each of these files. It’s important to ensure that they’re in the correct package namespace, and that they’re inside a node block, as we need a workspace:

InstallDeps.groovy

package com.michaelheap;
def execute() {
  node {
    sh 'npm install'
  }
}

ArchiveArtifacts.groovy

package com.michaelheap;
def execute() {
  node {
    dir('dist') {
      archiveArtifacts artifacts: '*.AppImage', fingerprint: true;
    }
  }
}
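
The remaining steps follow exactly the same pattern. For example, Test.groovy is just the npm test call wrapped in a node block:

```groovy
// Test.groovy - same shape as InstallDeps, but runs the test suite
package com.michaelheap;
def execute() {
  node {
    sh 'npm test'
  }
}
```

Checkout.groovy, Lint.groovy and Build.groovy wrap checkout scm, npm run lint and npm run build-linux in the same way.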

At this point, your ElectronApplication file will look pretty empty - just an empty execute function and a return this. We need to instruct Jenkins which steps to run. To do this, we’ll add a new run method that tries to execute a step and handles the error if anything fails:

#!/usr/bin/env groovy

package com.michaelheap;

def run(Object step){
    try {
        step.execute();
    } catch (err) {
        currentBuild.result = "FAILURE"
        error(err.message)
    }
}

def execute() {
}

return this;

Finally, we have to fill in the execute method with all of the steps we want to run:

def execute() {
    this.run(new Checkout());
    this.run(new InstallDeps());
    this.run(new Test());
    this.run(new Lint());
    this.run(new Build());
    this.run(new ArchiveArtifacts());
}

Once we commit all of our changes and new files to GitHub, the next time an ElectronApplication pipeline runs, it’ll use our new code.

For example, if we ever needed to set up a pipeline that automatically tested and published new modules to NPM, we wouldn’t have to reimplement all of those steps. We may have to create a new Publish.groovy that runs npm publish, but we can reuse all of our existing steps by creating an NpmModule.groovy file that has the following execute function:

def execute() {
    this.run(new Checkout());
    this.run(new InstallDeps());
    this.run(new Test());
    this.run(new Lint());
    this.run(new Publish());
}
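
Publish.groovy itself follows the same pattern as the other steps. A sketch, assuming the build machine is already authenticated against the npm registry:

```groovy
// Publish.groovy - pushes the package to the npm registry.
// Assumes registry credentials are already configured on the
// build agent (e.g. via ~/.npmrc); this is not set up by Jenkins.
package com.michaelheap;
def execute() {
  node {
    sh 'npm publish'
  }
}
```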

Once that’s added to our global library, we can use it in any project by adding a Jenkinsfile with the following contents:

def pipe = new com.michaelheap.NpmModule();
pipe.execute();

This will reuse all of our existing steps alongside the new Publish step to test, lint, and publish an NPM module.

Conditional builds

One of the awesome things about the GitHub Organization plugin is that it automatically detects new branches and pull requests and runs builds for them. This is great for things like testing and linting, but we don’t want to publish every branch we create. Generally, we want to run all of our tests on every branch but only publish from the master branch. Fortunately, Jenkins provides the branch name as an environment variable called env.BRANCH_NAME. We can therefore add a conditional in NpmModule.groovy so that we only publish when the pipeline is running against master:

def execute() {
    this.run(new Checkout());
    this.run(new InstallDeps());
    this.run(new Test());
    this.run(new Lint());
    if (env.BRANCH_NAME == "master") {
      this.run(new Publish());
    }
}

This works great for teams that are working towards a continuous delivery goal, but in the real world we do sometimes have to deploy from other branches too - whether it’s a legacy branch that receives security fixes or multiple active versions. Jenkins lets you do this too - the Jenkinsfile is just code after all!

If we wanted to publish everything on master, but also everything where the branch name starts with publish-, we could change our if statement to look like the following:

if (env.BRANCH_NAME == "master" || env.BRANCH_NAME.startsWith("publish-")) {
  this.run(new Publish());
}

Now, only commits to either master or a publish-* branch will be published.

Using 3rd Party Plugins in a Jenkinsfile

The final part of this post is a section on calling 3rd party plugins from your Jenkinsfile. Plugins may specifically support the Jenkinsfile, in which case they’re nice and easy to use. Take the Slack plugin for example - it’s been updated with Jenkinsfile compatibility, so it’s as simple as calling slackSend "string to send" in your Jenkinsfile.
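
For example, sending a notification at the end of a build might look like the following sketch (the message is illustrative, and slackSend also requires the plugin’s global Slack settings to be configured under Manage Jenkins):

```groovy
node {
  checkout scm
  sh 'npm install'
  sh 'npm test'

  // Requires the Slack plugin to be installed and its team/token
  // configured globally; env.JOB_NAME and env.BUILD_NUMBER are
  // provided by Jenkins on every build.
  slackSend "Build finished: ${env.JOB_NAME} #${env.BUILD_NUMBER}"
}
```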

Sadly, not all plugins have been updated with Jenkinsfile friendly syntax. In this situation, you need to know which class you want to call and use the special step method. The ws-cleanup plugin is one that I use all the time that hasn’t been updated, so I have to call it via step([$class: 'WsCleanup']). ws-cleanup doesn’t accept any parameters, but you can pass parameters via the step method. For example, the JUnitResultArchiver can be called via step([$class: 'JUnitResultArchiver', testResults: '**/target/surefire-reports/TEST-*.xml']) (as seen in the pipeline plugin documentation).

If we wrap these calls to step up in a custom step like we did with InstallDeps, Test etc then we can start working with the abstractions in our build pipelines. If the plugin ever updates to provide a Jenkinsfile friendly interface we only have a single place to edit rather than dozens of different projects.

WorkspaceCleanup.groovy

package com.michaelheap;
def execute() {
  node {
    step([$class: 'WsCleanup'])
  }
}

Hopefully you won’t find too many plugins that aren’t pipeline-friendly. As more and more people are starting to use the Jenkinsfile, plugin authors are making it easier to work with their addons.

There’s more!

We’re only scratching the surface, and there’s so much more that Jenkins’ new pipeline support can do; we didn’t even get around to building on slave machines, running jobs in parallel, or stage support! If you’d like to learn more about any of these topics (or Jenkinsfiles in general) feel free to ask! You can find me on Twitter as @mheap or you can email me at m@michaelheap.com.

If you’re doing something really cool with Jenkins and the Jenkinsfile, I’d love to hear from you as well - I bet I could learn something from you!
