Modern DevOps with Bitbucket Pipelines

Continuity in modern software development allows for frequent and consistent product release cycles, guaranteeing your business the ability to react to market changes while keeping your team productive.

In this session we’ll take a closer look at the guiding principles of continuous integration, continuous delivery and continuous deployment and highlight their key differences and their application using Bitbucket Pipelines.

bitbucket-pipelines.yml

```yaml
image: node:10.15.0
pipelines:
  default:
    - step:
        script:
          - node -v
```

Overview of DevOps Culture

DevOps combines operations tooling with agile engineering practices to encourage collaboration between development and operations teams.

DevOps is driven by shared responsibility and increased collaboration between teams.

A development team that shares responsibility with the operations team can come up with simpler ways of managing deployments and monitoring production services for faster feedback.

The operations team, in turn, can work closely with the development team to understand a system's operational needs and guide the adoption of new tools.


DevOps culture blurs the lines between teams; a common abuse of this culture is creating a separate "DevOps team".

Another core driver of DevOps is team autonomy. Developers and operations teams need to be able to make decisions and apply changes without going through a complex decision-making process.

At the core of DevOps is automation. Automation removes the chance of human error creeping in during testing, configuration management, and deployment.

DevOps pipeline

```mermaid
graph LR
  subgraph Continuous Integration
    delbuild[Build] -.-> deltest[Test]
    depbuild[Build] -.-> deptest[Test]
  end
  subgraph Continuous Delivery
    deltest -.-> delaccept[Review]
    delaccept -.-> delstage[Staging]
    delstage --> delprod[Deploy]
    delprod --> delsmoketest[Test]
  end
  subgraph Continuous Deployment
    deptest -.-> depaccept[Review]
    depaccept -.-> depstage[Staging]
    depstage -.-> depprod[Deploy]
    depprod -.-> depsmoketest[Test]
  end
```

Continuous Integration

Continuous Integration advocates for automated building and testing before merging with the main branch. Depending on the branching workflow, the code is merged back into the parent branch it was branched from.

```mermaid
gitGraph:
options
{
    "nodeSpacing": 100,
    "nodeRadius": 10
}
end
commit
branch branch1
checkout branch1
commit
checkout master
commit
merge branch1
```

Continuous Delivery

Continuous Delivery builds on automated testing. In this scenario, the developer has a consistent pipeline for deploying the application as often as needed. Small, frequent releases are encouraged, with feedback monitored for quicker troubleshooting. However, this practice normally requires human intervention to trigger the release.

```mermaid
gantt
dateFormat YYYY-MM-DD
title Ship Feature
section Sprint
V1.0.0   :done,   des1, 2019-01-06, 2019-01-08
V1.0.1   :active, des2, 2019-01-09, 3d
Deploy   :        des3, after des2, 5d
Feedback :        des4, after des3, 5d
```

Continuous Deployment

In Continuous Deployment, changes merged into the production branch are packaged and deployed without human intervention, in essence cutting release time from days to minutes.

```mermaid
sequenceDiagram
    participant CI/CD Server
    participant VM
    CI/CD Server->>VM: Build artifact
    loop New Build
        CI/CD Server->>CI/CD Server: build
    end
```

Bitbucket Pipelines

Bitbucket Pipelines is a CI/CD service that ships with Bitbucket, enabling automated building, testing, and deployment from within Bitbucket.

The underlying magic is the use of containers to build and package artifacts. We can use our own images to run builds. You can check out an overview of how building applications works.

At the core of Bitbucket Pipelines is the bitbucket-pipelines.yml file. This is a configuration-as-code file that is versioned along with your code.

Let's dive into the file itself and dissect the bits that facilitate builds.

Each pipeline is split into steps; the maximum number of steps is currently 10.

Each step runs its commands in a separate container instance, which means you can use different Docker images for different steps.
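As a sketch, a hypothetical pipeline could run a lint step on a Node image and a packaging step on a Python image (the step names and commands here are illustrative, not from the sample project):

```yaml
pipelines:
  default:
    - step:
        name: Lint
        image: node:10.15.0-alpine # overrides any global image for this step only
        script:
          - npm run lint
    - step:
        name: Package
        image: python:3.7          # a completely different image for this step
        script:
          - python setup.py sdist
```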

A pipeline runs depending on the context. Currently there are five supported contexts, and only one context runs at a time, depending on the commit.

Contexts are defined in the pipelines section of bitbucket-pipelines.yml and must have unique names; otherwise the default context is run.

```mermaid
graph TB
  subgraph pipelines
    subgraph context
      subgraph steps
        script[Build] -.-> script2[Test]
        script2 -.-> script3[Binary]
        script3 -.-> script4[Deploy]
      end
    end
  end
```
  • default : All commits run under this context unless the commit matches one of the following pipeline contexts.

  • branches : This context, if set, runs pipelines for commits on branches that match the specified pattern.

  • tags / bookmarks : This context runs pipelines for all commits whose tags match a tag/bookmark pattern (bookmarks apply to Mercurial repositories).

  • pull-requests : This context runs when there's a pull request against a branch in the repo.

  • custom : This is a manually triggered pipeline.
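Putting the contexts above together, a bitbucket-pipelines.yml skeleton might look like this (a sketch; the branch names, tag glob, and scripts are illustrative):

```yaml
pipelines:
  default:             # runs when no other context matches
    - step:
        script:
          - echo "default build"
  branches:
    develop:           # runs only for commits on develop
      - step:
          script:
            - echo "develop build"
  tags:
    'release-*':       # runs for tags matching the glob
      - step:
          script:
            - echo "release build"
  pull-requests:
    '**':              # runs for pull requests from any branch
      - step:
          script:
            - echo "PR build"
  custom:
    nightly:           # triggered manually from the Bitbucket UI
      - step:
          script:
            - echo "manual build"
```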

bitbucket-pipelines.yml

```yaml
image: python:3.7
definitions:
  services:
    mongo:
      image: mongo
pipelines:
  default:
    - step:
        name: Testing
        caches:
          - pip
        script:
          - pip install pymongo
          - python -m unittest discover tests/
        services:
          - mongo
```

Let’s take a look at some of the pipeline definitions. Take the following sample for reference.

bitbucket-pipelines.yml

```yaml
options:
  max-time: 10 # The build should not exceed 10 minutes
  docker: true # We'll be using Docker later, so have it enabled
image: node:10.15.0-alpine
pipelines:
  default: # (Unfortunately does not work once you enable branches)
    - step:
        caches:
          - node
          - docker # https://bitbucket.org/site/master/issues/14144/cache-docker-layers-between-builds
        script:
          - docker login --username $DOCKER_USERNAME --password $DOCKER_PASSWORD
          # - Maybe install Coala - but requires a Python build env; maybe we could check
          - chmod +x haspython.sh && ./haspython.sh
          - npm install netlify-cli snyk -g # Ensure Snyk and the Netlify CLI are installed
          - snyk auth ${SNYK_TOKEN} -d # Authenticate Snyk
          - netlify login # Authenticate Netlify
          # For all commits we want vulnerabilities tested, reporting low|medium|high severities
          - yarn

  pull-requests:
    '**': # For any branch on PR, e.g. a PR from the develop branch
      - parallel:
          - step:
              caches:
                - node
              name: Test Vulnerabilities
              script:
                - npm install snyk -g
                - snyk auth ${SNYK_TOKEN} -d # Authenticate Snyk
                - snyk protect
                # - yarn test
          - step:
              name: Lint Markdown
              caches:
                - node
              script:
                - export NODE_ENV=dev
                - yarn install --production=false
                - yarn run lint-md

  branches:
    develop:
      - parallel:
          - step:
              caches:
                - node
              name: Vulnerability Test
              script:
                - npm install snyk -g
                - snyk auth ${SNYK_TOKEN} -d # Authenticate Snyk
                - snyk protect
                # - yarn test
                # - snyk monitor
          - step:
              caches:
                - node
                - docker
              name: Markdown Lint and Copy Over
              script:
                - export NODE_ENV=dev
                - yarn install --production=false
                - yarn run lint-md
                - yarn run clean
                - yarn build
                - pipe: atlassian/scp-deploy:0.3.0
                  variables:
                    USER: "root"
                    SERVER: "1.2.3.4"
                    REMOTE_PATH: "/usr/src/blog/"
                    LOCAL_PATH: "public"
                    SSH_KEY: $SOME_KEY # Get this from ENV variables
                    DEBUG: "true"
                    EXTRA_ARGS: "-o ServerAliveInterval=10"
          - step:
              name: Create Image
              caches:
                - docker # https://bitbucket.org/site/master/issues/14144/cache-docker-layers-between-builds
              script: # Remove extra step and copy file to binary using artifacts
                - docker login --username $DOCKER_USERNAME --password $DOCKER_PASSWORD
                - docker build -t $DOCKER_USERNAME/$BITBUCKET_REPO_SLUG:$BITBUCKET_BUILD_NUMBER .
                - docker push $DOCKER_USERNAME/$BITBUCKET_REPO_SLUG:$BITBUCKET_BUILD_NUMBER
    master:
      - parallel:
          - step:
              name: Vulnerability Test
              caches:
                - node
              script:
                - npm install snyk -g
                - snyk auth $SNYK_TOKEN -d # Authenticate Snyk
                - snyk protect
                # - yarn test
                # - snyk monitor
          - step:
              name: Markdown Lint
              caches:
                - node
              script:
                - export NODE_ENV=dev
                - yarn install --production=false
                - yarn run lint-md
      - step:
          name: Clean and build
          caches:
            - node
          script:
            - export NODE_ENV=production
            - yarn install --production=true
            - yarn run clean
            - yarn build
          artifacts: # Share these with the next build stage
            - public/**
      - step:
          name: Deploy to Netlify
          trigger: manual
          caches:
            - node
          script:
            - npm install netlify-cli -g
            - netlify deploy --dir=public --prod
```

The script haspython.sh is as shown. It just checks for the Python interpreter and version.

haspython.sh

```bash
#!/bin/bash

if ! hash python; then
    echo "Python is required to run some of these tests"
    exit 1
fi

# Reduce the version to its major and minor digits, e.g. Python 2.7.15 -> "27"
pyver=$(python -V 2>&1 | sed 's/.* \([0-9]\).\([0-9]\).*/\1\2/')
if [[ "$pyver" -lt "27" || "$pyver" -gt "30" ]]
then
    echo "This script requires Python 2.7.x. Found Python $pyver"
    exit 1
fi
```

options : These are global settings that apply to the whole pipeline.

image : Defines the Docker image to be used throughout the pipeline. This can be overridden per step or per definition.

pipelines : Defines the pipeline and the steps to run. This section must always be present.

default : This context runs for all commits that don't match any Mercurial bookmark, tag, or branch-specific pipeline definition.

step : This is a build execution unit with the following properties:

  • image : The Docker image to run the step in
  • max-time : The maximum number of minutes the step can run
  • caches : The cached dependencies to use
  • services : Separate Docker containers to run alongside the build
  • artifacts : The files to be shared between steps
  • trigger : Defines whether a step requires human intervention or runs automatically (the default). Note that the build pauses to facilitate manual processes.
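A sketch of steps exercising most of these properties (the image, service, paths, and scripts are illustrative, not from the sample project):

```yaml
- step:
    name: Build and package
    image: node:10.15.0-alpine  # per-step image override
    max-time: 15                # minutes before the step is killed
    caches:
      - node
    services:
      - mongo                   # defined under definitions -> services
    script:
      - yarn install
      - yarn build
    artifacts:
      - dist/**                 # shared with subsequent steps
- step:
    name: Deploy
    trigger: manual             # pipeline pauses here until triggered
    script:
      - ./deploy.sh dist/
```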

parallel : This enables you to run multiple steps at the same time


script : These are the commands to execute, in sequence. Large scripts should be moved into script files and executed from there.

branches : This section defines branch-specific build commands for your pipelines and always overrides default. You can use glob patterns to match branches.

pull-requests : This defines build commands to run on pull requests. The changes are merged first, before the PR pipeline starts; if the merge fails, the pipeline stops. Forks don't trigger pipelines.

pipes : A newer feature that aims to simplify common tasks: you supply variables to a prebuilt pipe instead of scripting the task yourself.

Bitbucket Tips and Tricks

  • To prevent unnecessary triggering of pipelines, include [skip ci] or [ci skip] in your commit message.
  • Use YAML anchors to avoid repetition in your pipelines configuration.

  • You can speed up builds immensely by using services; this avoids having to spin up multiple Docker images per step.

  • Read the docs
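The YAML-anchors tip above can be sketched as follows: an anchor (`&`) names a node once and an alias (`*`) reuses it, so a shared step only has to be written one time (the step contents and branch names here are illustrative):

```yaml
definitions:
  steps:
    - step: &run-tests          # anchor: define the step once
        name: Run tests
        caches:
          - node
        script:
          - yarn install
          - yarn test
pipelines:
  branches:
    develop:
      - step: *run-tests        # alias: reuse the anchored step
    master:
      - step: *run-tests
```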

Practical
