The Jenkins Blue Ocean logo.
February 21, 2017

Blue Ocean Pipeline Automation

Blue Ocean 1.0.0 was released in early April 2017, and it looks great. There has been some criticism from the community, but we think Blue Ocean has come a long way and is continuing to grow. Admittedly, adopting Blue Ocean won’t come without some growing pains, but the overall benefits will help make your software delivery better. Here, I’ll share some history, the main benefits of Blue Ocean, and some examples of software and infrastructure as code.

Pipelines of Old

We already have a pipeline; what’s wrong with it? Both implementation and visualization could be better.

How pipelines are implemented varies from project to project. Historically, pipelines were created manually by an admin, and these jobs were tedious, prone to error, and – without proper backups – subject to loss if the server went down. If there were hundreds of jobs and something changed project-wide, the maintainer had to verify that the change was applied to every job. Implementation is much simpler with a streamlined way to keep the configuration alongside the project files. This has been partially remedied with Groovy scripting and the Jenkinsfile plugin, but it can always be better.

Visualizing a pipeline in Jenkins before Blue Ocean wasn’t aesthetically pleasing and required manual setup of the views. An admin could use the Build Pipeline Plugin to get a visual representation of the pipeline, but that took extra time. Developers might have to dig deep into folders to find the job they needed, only to be met with the wall of text that the console outputs. Visualization could be improved by automatically generating views for users and helping them navigate the messy log files.

Benefits

Blue Ocean is a project designed to enhance the Jenkins user experience. Some of the main benefits are:

  • Jenkinsfile and Pipeline Integration
  • Visualization of Pipelines
  • Focus on what the current user is interested in
  • Logs that are easy to navigate

Jenkinsfile and Pipeline Integration

It is true that standard Jenkins can use pipelines and a Jenkinsfile, but the native support built into Blue Ocean rises above it. A Jenkinsfile is a configuration file that defines the pipeline for a particular project. It is designed to live alongside your code in a repo, bringing accountability and reliability to your configuration through source control. This close coupling of Jenkinsfile and repo allowed Blue Ocean to implement an easy way to scan for pipelines via the “New Pipeline” option.
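
For reference, a minimal Declarative Jenkinsfile looks something like the sketch below; the stage names and shell commands are placeholders rather than anything project-specific.

// A minimal Declarative Jenkinsfile sketch (stage names and commands are placeholders)
pipeline {
    agent any                  // run on any available agent
    stages {
        stage('Build') {
            steps {
                sh 'make build'    // placeholder build command
            }
        }
        stage('Test') {
            steps {
                sh 'make test'     // placeholder test command
            }
        }
    }
}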

Creating a new pipeline in Blue Ocean

This option brings up the Create Pipeline view, which includes step-by-step instructions for adding a new pipeline to the list. If you use GitHub, you can choose the repository you want to use or have Blue Ocean scan the entire organization looking for Jenkinsfiles.

Options for where to create a new pipeline.

Otherwise, you can provide a URL and credentials to access a repo elsewhere. Once completed, the application informs you that the new pipeline has been created and is ready to view. Example Jenkinsfiles can be found later in this post.

Adding a repository URL and credentials to create the new pipeline.

Blue Ocean and Jenkinsfile provide great feedback on pipeline configuration. Getting an error an hour into a build because of a bad value or a misspelling is a thing of the past. A Declarative Jenkinsfile is linted at the start of a job, and the job fails fast if something is incorrect. Linting before committing is also possible using the jenkins-cli. Both of these features add to the benefits of pipeline as code and make building software less painful.

Blue Ocean showing 1 error at the start of the job.
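
As a rough example of pre-commit linting, you can run the declarative linter against a running Jenkins instance from the command line; the URL below is a placeholder for your own controller.

# Lint a local Jenkinsfile against a running Jenkins instance (URL is a placeholder)
java -jar jenkins-cli.jar -s http://localhost:8080/ declarative-linter < Jenkinsfile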

Visualization of Pipelines and Projects

In 2017, UI/UX design isn’t just for consumers anymore. Out of the box, Blue Ocean has a more modern, streamlined appearance that makes it easy to work in. Aside from aesthetics, the views are personalized to show your favorite projects, the projects you work on, and all of their statuses, so each user can focus on the items important to them. Favorites and the projects they work on are listed on top, with color coding to draw quick attention to what is broken. You can select favorites by clicking the star on the right-hand side of the page, and, in good Jenkins fashion, the weather report is still available with newer icons.

Two views of the same instance by different users.

Blue Ocean also includes a new pipeline view that shows the steps a project takes through the pipeline. The main sections of the view are broken into the stages in the project’s Jenkinsfile. If a stage has any parallel steps, those are shown stacked. If a stage fails, it is clearly outlined in the view, and you can dig further into the issue by selecting the failing step to see its log.

Blue Ocean pipeline view broken down by stages.
Blue Ocean pipeline view broken down by stages, with Testing stage showing failure.

Easy-to-Read Logs

Speaking of logs, most people hate having to dig through huge log files to determine what went wrong with a build or test. Blue Ocean breaks up the console output into the same sections that the pipeline view shows. On top of that, those sections may contain multiple steps, and each step gets its own break in the console output.

The console log readout of the failed testing stage.

Blue Ocean Pipeline Automation: Basic Examples

Now you have seen some of the features of Blue Ocean pipeline automation, but how about actually using them? Moving to Jenkinsfile and pipeline as code takes some serious thought and effort, but it is worth it in the long run. Note that there are two types of Jenkinsfile, Scripted and Declarative; the examples below are both Declarative.

Here are two basic examples to look at:

Node.js Example

Node.js is a popular JavaScript runtime used in projects all over. If you are here after viewing the Linux Foundation’s Intro to Continuous Delivery course, then you are already familiar with the Dromedary project. If you aren’t, Dromedary is a demo application by Stelligent (thanks!). You are welcome to spin up your own instance of Jenkins with all of the Blue Ocean plugins to try this.

Let’s dive into the Jenkinsfile; a sketch of the full file follows the breakdown below.

  1. Agent
     1. The agent defines the environment in which the pipeline runs.
     2. This build uses a Dockerfile placed at the top level of the software’s repository. The Dockerfile takes the Node.js image and creates a new image that includes Gulp.
  2. Stages
     1. An Initialize stage installs the node_modules needed for testing.
     2. Unit testing using Gulp.
     3. Convergence testing (there is no real convergence testing here; it exists for the sake of example).
        1. Parallelized steps all run within a single stage.
        2. You can spread these parallelized steps across different agents to save time and configuration.
     4. Build using Gulp. In your own software you can do whatever you see fit.
     5. Deploy
        1. Deploy the application with your preferred option.
        2. Some options are Heroku, AWS, an artifact repo, etc.
The Dromedary demo application in pipeline view.
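
Here is a sketch of a Jenkinsfile along those lines. The Gulp task names and the deploy step are illustrative placeholders rather than the exact Dromedary configuration, and the parallel-stages syntax shown comes from newer Declarative Pipeline releases (early 2017 versions used the parallel step instead).

// Declarative Jenkinsfile sketch for the Node.js example (task names are placeholders)
pipeline {
    agent { dockerfile true }          // build the agent image from the Dockerfile in the repo
    stages {
        stage('Initialize') {
            steps {
                sh 'npm install'       // install the node_modules needed for testing
            }
        }
        stage('Unit Test') {
            steps {
                sh 'gulp test'         // unit testing with Gulp (placeholder task name)
            }
        }
        stage('Convergence Test') {
            // placeholder parallel branches; Blue Ocean shows these stacked within one stage
            parallel {
                stage('Lint') {
                    steps {
                        sh 'gulp lint'
                    }
                }
                stage('Coverage') {
                    steps {
                        sh 'gulp coverage'
                    }
                }
            }
        }
        stage('Build') {
            steps {
                sh 'gulp build'        // placeholder build task
            }
        }
        stage('Deploy') {
            steps {
                echo 'Deploy with your preferred option: Heroku, AWS, an artifact repo, etc.'
            }
        }
    }
}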

Chef Infrastructure Example

Liatrio’s infrastructure code also goes through CI. Historically we have used Chef, but this methodology works with whatever infrastructure tool you prefer, as long as testing is possible.

This example is straightforward; a sketch of the Jenkinsfile follows the breakdown. Let’s take a look:

  1. The agent for this pipeline is any, which in this case means the master agent.
  2. Stages are similar to the previous example.
     1. The Setup step prints the version for log verification, if needed, and pulls dependencies with Berkshelf.
     2. Acceptance testing runs in parallel so it is easy to see what failed, if anything did.
     3. Test Kitchen is a step that purposely fails for the sake of this description. It runs convergence (integration) testing of the cookbook.
  3. Post – runs after the build is complete.
     1. Success – simply prints in this example, but you could send Slack messages, emails, etc. on a successful build.
     2. Failure – prints on a failed build, but again you could send emails, Slack messages, etc.
Liatrio's demo application failed on Test step of the pipeline.
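
A matching Jenkinsfile might look something like the sketch below. The acceptance-test commands (Foodcritic and RuboCop) stand in for whatever cookbook linting you run, and the post section simply echoes instead of sending notifications.

// Declarative Jenkinsfile sketch for the Chef cookbook example (commands are illustrative)
pipeline {
    agent any                            // runs on the master agent in this case
    stages {
        stage('Setup') {
            steps {
                sh 'chef --version'      // print the version for log verification
                sh 'berks install'       // pull cookbook dependencies with Berkshelf
            }
        }
        stage('Acceptance') {
            // placeholder parallel lint branches so a failure is easy to spot in the view
            parallel {
                stage('Foodcritic') {
                    steps {
                        sh 'foodcritic .'
                    }
                }
                stage('Rubocop') {
                    steps {
                        sh 'rubocop .'
                    }
                }
            }
        }
        stage('Test Kitchen') {
            steps {
                sh 'kitchen test'        // convergence (integration) testing of the cookbook
            }
        }
    }
    post {
        success {
            echo 'Build succeeded'       // swap in Slack messages, emails, etc.
        }
        failure {
            echo 'Build failed'          // swap in Slack messages, emails, etc.
        }
    }
}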

Final Thoughts

The above examples are trivial at best, but they show the options and capabilities. Blue Ocean is a new take on pipeline automation, and there will be growing pains, but we hope we’ve shown that it’s worth it.

If you have any comments or questions, reach out to us.

