
Saturday, 6 March 2021

Terraform CD Pipeline Setup to Deploy App Servers

Overview

The goal is to implement DevOps best practices for running Terraform in Jenkins pipelines. We will go over the main concepts that need to be considered and a Jenkinsfile that runs Terraform. The Jenkinsfile consists of parameters that allow us to pass data as variables into our pipeline job.


  • Install Terraform on Jenkins Server
  • Install Terraform Plugin on Jenkins
  • Configure Terraform
  • Store and Encrypt Credentials in Jenkins
  • Setting up CD Pipeline with Terraform to Deploy App Servers
  • Run Pipeline Job


Install Terraform on Jenkins Server


Use the following commands to install Terraform on the Jenkins server and move the binary to the correct path as shown below.


  • wget https://releases.hashicorp.com/terraform/0.12.24/terraform_0.12.24_linux_amd64.zip
  • unzip terraform_0.12.24_linux_amd64.zip
  • sudo mv terraform /usr/bin/
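
To confirm the install, check that the binary is on the PATH and reports the expected version (a quick sanity check; the output below assumes the 0.12.24 zip from the previous step):

    which terraform       # should print /usr/bin/terraform
    terraform version     # should print Terraform v0.12.24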

Install Terraform plugin on Jenkins

Go to Manage Jenkins > Manage Plugins > Available > search for Terraform as shown below:

As you can see, the Terraform plugin is already installed on my Jenkins, which is why it appears in the Installed section.

Store and Encrypt Credentials in Jenkins (Access and Secret Key) 

In this step, we will store and encrypt the access and secret keys in Jenkins to maximize security and minimize the chance of exposing our credentials.

    • Go to Manage Jenkins > Manage Credentials > click on the highlighted Jenkins link as shown below


    • Select Add Credentials
    • Choose Secret text in the Kind field
    • Enter the following below:
    Note: Replace the placeholder values below with your own.
      • Secret = EnterYourSecretKeyHere
      • ID = AWS_SECRET_ACCESS_KEY
      • Description = AWS_SECRET_ACCESS_KEY
    Click OK

    Add another credential and enter the following:

      • Secret = EnterYourAccessIDHere
      • ID = AWS_ACCESS_KEY_ID
      • Description = AWS_ACCESS_KEY_ID

    Click OK





    Configure Terraform

    Go to Manage Jenkins > Global Tool Configuration > Terraform will be displayed in the list.

    • Enter terraform in the Name field
    • Provide the path /usr/bin/ as shown below




    Setting up CD Pipeline for Terraform

    • Go to Jenkins > New Item. Enter terraform-pipeline in the Name field > Choose Pipeline > Click OK


    • Select Configure after creation.
    • Go to Build Triggers and enable Trigger builds remotely.
    • Enter tf_token as the Authentication Token

     

    Bitbucket Changes
      • Create a new Bitbucket Repo and call it terraform-pipeline
      • Go to Repository Settings after creation and select Webhooks
      • Click Add webhook
      • Enter tf_token as the Title
      • Copy and paste the URL as shown below
                  http://JENKINS_URL:8080/job/terraform-pipeline/buildWithParameters?token=tf_token
      • Status should be Active
      • Check Skip certificate verification
      • Triggers --> Repository push
    • Go back to Jenkins and select Pipeline script from SCM
    • Enter your Bitbucket credentials, leave the Branch field blank, and make sure the Script Path is Jenkinsfile
    • Right-click on Pipeline Syntax and open it in a new tab.
    • Choose Checkout from Version Control in the Sample Step field
    • Enter the Bitbucket Repository URL and Credentials, leaving the branches blank
    • Click GENERATE PIPELINE SCRIPT, then copy the credentialsId and url values (these are required for the Jenkinsfile script)
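
Before relying on the Bitbucket webhook, you can test the remote trigger by calling the build URL yourself. A minimal sketch using curl (jenkins_user and API_TOKEN are placeholders for your own Jenkins user and API token; depending on your security settings Jenkins may accept the token alone):

    curl -X POST --user "jenkins_user:API_TOKEN" \
      "http://JENKINS_URL:8080/job/terraform-pipeline/buildWithParameters?token=tf_token"

A 201 Created response means Jenkins queued the build.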



    Create Workspace for Terraform Pipeline
    • Open File Explorer, navigate to the Desktop, and create a folder called cd_pipeline

    • Once the folder has been created, open Visual Studio Code and add the folder to the workspace

    • Open the Terminal
    • Navigate to the terraform-pipeline repo in Bitbucket
    • Clone the repo with SSH or HTTPS (cloning creates the repo's .git directory, so there is no need to run git init first)
    • Create a new file main.tf and copy in the code below:

provider "aws" {
  region  = var.region
  version = "~> 2.0"
}

resource "aws_instance" "ec2" {
  ami           = "ami-0782e9ee97725263d" # Change the AMI to match your OS requirements as needed.
  instance_type = var.instance_type
  key_name      = "Enter_KEYPAIR_Name_Here"
  user_data     = base64encode(file("deploy.sh"))

  vpc_security_group_ids = [aws_security_group.ec2_SecurityGroups.id]

  root_block_device {
    volume_type           = "gp2"
    volume_size           = 200
    delete_on_termination = true
    encrypted             = true
  }

  tags = {
    Name        = "u2-${var.environment}-${var.application}"
    CreatedBy   = var.launched_by
    Application = var.application
    OS          = var.os
    Environment = var.environment
  }
}

output "ec2_ip" {
  value = aws_instance.ec2.*.private_ip
}

output "ec2_ip_public" {
  value = aws_instance.ec2.*.public_ip
}

output "ec2_name" {
  value = aws_instance.ec2.*.tags.Name
}

output "ec2_instance_id" {
  value = aws_instance.ec2.*.id
}

    • Create a new file security.tf and copy in the code below:

resource "aws_security_group" "ec2_SecurityGroups" {
  name        = "u2-${var.environment}-sg-${var.application}"
  description = "EC2 SG"

  ingress {
    from_port   = 22
    to_port     = 22
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }

  ingress {
    from_port   = 8081
    to_port     = 8081
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }

  ingress {
    from_port   = 8082
    to_port     = 8082
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }

  ingress {
    from_port   = 80
    to_port     = 80
    protocol    = "tcp"
    cidr_blocks = ["0.0.0.0/0"]
  }

  # Allow all outbound
  egress {
    from_port   = 0
    to_port     = 0
    protocol    = "-1"
    cidr_blocks = ["0.0.0.0/0"]
  }
}

    • Create a new file variable.tf and copy in the code below:

variable "region" {
  type    = string
  default = "us-east-2"
}

variable "instance_type" {}

variable "application" {}

variable "environment" {}

############## tags

variable "os" {
  type    = string
  default = "Ubuntu"
}

variable "launched_by" {
  type    = string
  default = "USER"
}

############## end tags
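
With main.tf, security.tf, and variable.tf in place, you can lint and validate them locally before wiring anything into Jenkins. This is a sanity check run from the cd_pipeline folder, not part of the pipeline itself (init with -backend=false only downloads the AWS provider):

    terraform init -backend=false
    terraform fmt
    terraform validate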


    • Create a new file deploy.sh and copy in the code below:

#!/bin/bash
set -x

# Send all output to /var/log/user-data.log and the console
exec > >(tee /var/log/user-data.log|logger -t user-data -s 2>/dev/console) 2>&1

echo ""
echo "........................................"
echo "Installation of application"
echo "........................................"
echo "Today's date: `date`"
echo "........................................"
echo ""
sudo pip install awscli
sudo apt-get install -y unzip
sudo apt update
sudo apt -y dist-upgrade     # -y keeps the script non-interactive; user data cannot answer prompts
sudo apt -y autoremove
sudo apt update
sudo apt-get install -y openjdk-8-jdk openjdk-8-doc
java -version
sudo apt install -y wget software-properties-common
wget -qO - https://api.bintray.com/orgs/jfrog/keys/gpg/public.key | sudo apt-key add -
sudo add-apt-repository "deb [arch=amd64] https://jfrog.bintray.com/artifactory-debs $(lsb_release -cs) main"
sudo apt update
sudo apt install -y jfrog-artifactory-oss
sudo systemctl stop artifactory.service
sudo systemctl start artifactory.service
sudo systemctl enable artifactory.service
sudo systemctl status artifactory.service
echo ""
echo "........................................"
echo "Installation of application complete"
echo "........................................"
echo "Today's date: `date`"
echo "........................................"
echo ""




    • Create a new file Jenkinsfile and copy in the code below:



    pipeline {
        agent {
          node {
            label "master"
          } 
        }

        parameters {
            string(name: 'AppName', defaultValue: 'Enter App Name', description: 'Name of application', )
            choice(choices: ['master', 'dev', 'qa', 'prod'], description: 'Select lifecycle to Deploy', name: 'Branch')
            choice(choices: ['t2.micro', 't2.small', 't2.medium'], description: 'Select Instance Size', name: 'InstanceSize')
            booleanParam(name: 'autoApprove', defaultValue: false, description: 'Automatically run apply after generating plan?')
        }


         environment {
            AWS_ACCESS_KEY_ID     = credentials('AWS_ACCESS_KEY_ID')
            AWS_SECRET_ACCESS_KEY = credentials('AWS_SECRET_ACCESS_KEY')
            TF_VAR_instance_type = "${params.InstanceSize}"
            TF_VAR_environment = "${params.Branch}"
            TF_VAR_application = "${params.AppName}"
        }

        stages {
          stage('checkout') {
            steps {
                echo "Pulling changes from the branch ${params.Branch}"
                git credentialsId: 'paste-credentialsId-here', url: 'paste-url-here' , branch: "${params.Branch}"
            }
          }

            stage('terraform plan') {
                steps {
                    sh "pwd ; terraform init -input=true"
                    sh "terraform plan -input=true -out tfplan"
                    sh 'terraform show -no-color tfplan > tfplan.txt'
                }
            }
            
            stage('terraform apply approval') {
               when {
                   not {
                       equals expected: true, actual: params.autoApprove
                   }
               }

               steps {
                   script {
                        def plan = readFile 'tfplan.txt'
                        input message: "Do you want to apply the plan?",
                        parameters: [text(name: 'Plan', description: 'Please review the plan', defaultValue: plan)]
                   }
               }
           }

            stage('terraform apply') {
                steps {
                    sh "terraform apply -input=true tfplan"
                }
            }
            
            stage('terraform destroy approval') {
                steps {
                    input 'Run terraform destroy?'
                }
            }
            stage('terraform destroy') {
                steps {
                    sh 'terraform destroy -auto-approve'
                }
            }
        }

      }
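
The environment block above is what ties the job parameters to Terraform: Terraform reads any environment variable named TF_VAR_<name> as the input variable <name>, so InstanceSize, Branch, and AppName flow into variable.tf without any extra flags. You can reproduce the same behaviour locally (example values):

    export TF_VAR_instance_type=t2.micro
    export TF_VAR_environment=dev
    export TF_VAR_application=artifactory
    terraform plan    # picked up as var.instance_type, var.environment, var.application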

    • Commit and push the code changes to the repo with the following:
      • In VS Code, navigate to the Source Control icon in the sidebar
      • Enter a commit message
      • Click the + icon to stage the changes and commit

      • Push the changes by clicking the sync indicator (🔄 0 ⬇️ 1 ⬆️) in the status bar as shown below
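
If you prefer the command line to the VS Code buttons, the equivalent (assuming the master branch) is:

    git add main.tf security.tf variable.tf deploy.sh Jenkinsfile
    git commit -m "Add Terraform pipeline files"
    git push origin master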

    Run Pipeline Job

    • Go to terraform-pipeline on Jenkins and run a build
    Note: The first run will fail; Jenkins registers the parameters defined in the Jenkinsfile during that run, so they are only available from the second run onwards.

    • The next time you run a build, you should see the parameters as shown below





    • Enter Artifactory in the AppName field
    • Select a Branch/Lifecycle to deploy the server to
    • Choose t2.small or t2.medium for the Artifactory server
    • Go to Console Output to track progress
    Note: You can abort the destroy stage and rerun just that stage later (for example with the Blue Ocean plugin installed on Jenkins) to delete the resources that were created.


    Friday, 16 October 2020

    How to use Jenkinsfile in a Pipeline(Pipeline As Code)/Convert your Scripted Pipeline into Jenkinsfile

    Prerequisite: Please make sure you have completed the exercise at https://violetstreamstechnology.blogspot.com/2020/09/understanding-pipelines-how-to-create.html


    What is a Jenkinsfile?

    Jenkins pipelines can be defined in a text file called a Jenkinsfile. A Jenkinsfile lets you implement pipeline as code, and it is written in a domain-specific language (DSL). With a Jenkinsfile, you can write out the steps needed to run a Jenkins pipeline.

    The benefits of using a Jenkinsfile are:

    • You can create pipelines automatically for all branches and run builds for pull requests with just one Jenkinsfile.
    • You can review the pipeline code like any other code.
    • You can audit your Jenkins pipeline.
    • It is the single source of truth for your pipeline and can be modified by multiple users.



    This pipeline was defined by the Groovy code placed in the Pipeline section of the job.

    You may notice something: anyone who has access to this job can modify the pipeline as they wish. This can cause lots of problems, especially with large teams: developers can manipulate their builds to always pass, there is no accountability or integrity of process, and the pipeline becomes unmaintainable.
    To remediate these issues, Jenkins gives us the ability to use a Jenkinsfile, so that the pipeline code can be placed in a repo and version controlled instead of living only in Jenkins.

    How to convert your existing Jenkins Pipeline to Jenkinsfile

    Step 1:
    Go to your project folder on your computer and open Git Bash.

    Step 2: Go into your repo (cd myfirstrepo), then open VS Code.






    Step 3: Create a new file in VS Code and name it Jenkinsfile (note: this file has no extension)


    Step 4: Go to your existing Jenkins pipeline, copy the pipeline code, and paste it into the Jenkinsfile



    Your code should look like this:
node {
    stage('Checkout') {
        build job: 'CheckOut'
    }
    stage('Build') {
        build job: 'Build'
    }
    stage('Code Quality scan') {
        build job: 'Code_Quality'
    }
    stage('Archive Artifacts') {
        build job: 'Archive_Artifacts'
    }
    stage('Publish to Artifactory') {
        build job: 'Publish_To_Artifactory'
    }
    stage('DEV Approve') {
        echo "Taking approval from DEV"
        timeout(time: 7, unit: 'DAYS') {
            input message: 'Do you want to deploy?', submitter: 'admin'
        }
    }
    stage('DEV Deploy') {
        build job: 'Deploy_To_Container'
    }
    stage('Slack notification') {
        build job: 'Slack_Notification'
    }
}

    Step 5: Save and push your changes to your repo (you can do this with VS Code too, but I will use Git Bash)
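
In Git Bash, the usual sequence looks like this (assuming your branch is master):

    git add Jenkinsfile
    git commit -m "Add Jenkinsfile"
    git push origin master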



    Check for your Jenkinsfile in the repo


    Step 6: Go to Jenkins and create a new pipeline job: Pipeline_From_JenkinsFile
    Select Pipeline script from SCM

    Enter your Bitbucket credentials, specify the branch, and make sure the Script Path is Jenkinsfile




    Step 7: Save and Run








    Thursday, 24 September 2020

    Understanding Pipelines- How to create a Jenkins Pipeline from Multiple FreeStyle Jobs

     


    What is Pipeline?

    A pipeline in a software engineering team is a set of automated processes that allow developers and DevOps professionals to reliably and efficiently compile, build, and deploy their code to their production compute platforms. There is no hard and fast rule stating what a pipeline should look like or the tools it must utilise; however, the most common components of a pipeline are build automation/continuous integration, test automation, and deployment automation.

    In Jenkins, a pipeline is a group of events or jobs that are interlinked with one another in a sequence.

    What is a Jenkinsfile?

    Jenkins pipelines can be defined in a text file called a Jenkinsfile. A Jenkinsfile lets you implement pipeline as code, and it is written in a domain-specific language (DSL). With a Jenkinsfile, you can write out the steps needed to run a Jenkins pipeline.

    The benefits of using a Jenkinsfile are:

    • You can create pipelines automatically for all branches and run builds for pull requests with just one Jenkinsfile.
    • You can review the pipeline code like any other code.
    • You can audit your Jenkins pipeline.
    • It is the single source of truth for your pipeline and can be modified by multiple users.

    A pipeline can be defined either through the Jenkins web UI or with a Jenkinsfile checked into source control.

    Declarative versus Scripted pipeline syntax:

    There are two types of syntax used for defining your Jenkinsfile.

    1. Declarative
    2. Scripted

    Declarative:

    Declarative pipeline syntax offers an easy way to create pipelines. It contains a predefined hierarchy to create Jenkins pipelines. It gives you the ability to control all aspects of a pipeline execution in a simple, straightforward manner.

    Scripted:

    A scripted Jenkins pipeline runs on the Jenkins master with the help of a lightweight executor. It uses very few resources to translate the pipeline into atomic commands. Declarative and scripted syntax are structured quite differently from each other.

    Why Use Jenkins Pipeline?

    Jenkins is an open source continuous integration server with the ability to support the automation of software development processes. You can create multiple automation jobs with the help of use cases, and run them as a Jenkins pipeline.

    Here are the reasons why you should use Jenkins pipeline:

    • Jenkins pipeline is implemented as a code which allows multiple users to edit and execute the pipeline process.
    • Pipelines are robust. So if your server undergoes an unforeseen restart, the pipeline will be automatically resumed.
    • You can pause the pipeline process and make it wait to resume until there is an input from the user.
    • Jenkins Pipelines support big projects. You can run multiple jobs, and even use pipelines in a loop.


    Jenkins Pipeline Concepts

    • Pipeline: A set of instructions given in the form of code for continuous delivery, containing the instructions needed for the entire build process. With a pipeline, you can build, test, and deliver the application.
    • Node: The machine on which Jenkins runs is called a node. A node block is mainly used in scripted pipeline syntax.
    • Stage: A stage block contains a series of steps in a pipeline; that is, the build, test, and deploy processes all come together in a stage. Generally, a stage block is used to visualize the Jenkins pipeline process.
    • Step: A step is a single task that executes a specific process at a defined time. A pipeline involves a series of steps.
    Create a Simple Pipeline
    Step 1: Create a new job for code checkout. You can name it CheckOut.
    Step 2: Click Advanced, tick Use custom workspace, and enter tmp as the workspace directory.



    Step 3: Configure the SCM checkout by adding your Bitbucket URL and credentials



    Click Save.

    Step 4: Create another job for the build stage. You can call it Build.


    Step 5: Repeat Step 2
    Step 6: Configure it with the Maven config as shown below, then Save
    Step 7: Create a new job to archive artifacts. Name it Archive_Artifacts.
    Step 8: Repeat Step 2
    Step 9: Configure as shown below:

    Step 10: Create a new job. Name it Publish_To_Artifactory.
    Step 11: Repeat Step 2
    Step 12: Configure as shown below:




    Save
    Step 13: Create a new job: Code_Quality
    Step 14: Repeat Step 2
    Step 15: Configure as shown below:




    Click Save
    Step 16: Create a new job: Deploy_To_Container
    Step 17: Repeat Step 2
    Step 18: Configure as shown below:


    Click Save
    Step 19: Create a new job: Slack_Notification
    Step 20: Repeat Step 2
    Step 21: Configure as shown below
    Step 22: Create a pipeline job: MyCI_CB_CD_CT_CN_Pipeline
    Step 23: Click Advanced Project Options
    Enter a project name: Continuous Integration-Continuous Build-Continuous Deployment-Continuous Test-Continuous Notification
    Step 24: Copy and paste the code below into the Pipeline Script box:


node {
    stage('Checkout') {
        build job: 'CheckOut'
    }
    stage('Build') {
        build job: 'Build'
    }
    stage('Code Quality scan') {
        build job: 'Code_Quality'
    }
    stage('Archive Artifacts') {
        build job: 'Archive_Artifacts'
    }
    stage('Publish to Artifactory') {
        build job: 'Publish_To_Artifactory'
    }
    stage('DEV Approve') {
        echo "Taking approval from DEV"
        timeout(time: 7, unit: 'DAYS') {
            input message: 'Do you want to deploy?', submitter: 'admin'
        }
    }
    stage('DEV Deploy') {
        build job: 'Deploy_To_Container'
    }
    stage('Slack notification') {
        build job: 'Slack_Notification'
    }
}



    Save
    Build Now
    Your Pipeline will build like below


    The above is an example of a scripted pipeline. It follows the structure below:

    // Scripted pipeline
    node {
      stage('Build') {
           echo 'Building....'
      }
      stage('Test') {
          echo 'Testing....'
      }
      stage('Deploy') {
          echo 'Deploying....'
      }
    }
    On the other hand, a declarative pipeline follows this structure:
    // Declarative pipeline
    pipeline {
      agent { label 'slave-node' }
      stages {
        stage('checkout') {
          steps {
            git 'https://bitbucket.org/myrepo'
          }
        }
        stage('build') {
          tools {
            maven 'Maven3'
          }
          steps {
            sh 'mvn clean test'
          }
        }
      }
    }