Introduction
This article explains, step by step, how to create a Jenkins pipeline that backs up source code, compares versions, and performs basic validation checks (using the cpilint tool) for integration flows created in SAP Integration Suite. Most of the content is based on the two GitHub repositories below.
- CICD-StoreIntegrationArtefact - by sunny.kapoor2 and the SAP team
- cpilint - by Morten
Main
Prerequisites:
- Docker should be installed (Docker Desktop as well, if you want a GUI).
Jenkins Installation (Using Docker)
Jenkins can be installed with a single command from any terminal or cmd:
docker run -p 8080:8080 -p 50000:50000 -v jenkins_home:/var/jenkins_home jenkins/jenkins:lts-jdk11
Jenkins Docker Image Run
After installation, copy the temporary password, go to localhost:8080 in the browser, provide the password and click Continue. Then create an admin user and press Next. On the next page, accept or adjust the Jenkins URL, then click Save and Finish. Do not install the suggested plugins; only proceed with "Select plugins to install".
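If you prefer to run Jenkins in the background, a minimal variant of the same command is sketched below; the container name jenkins is an assumption, not required by the image. The temporary admin password can then be read straight from the container.
# run Jenkins detached under an assumed container name
docker run -d --name jenkins -p 8080:8080 -p 50000:50000 -v jenkins_home:/var/jenkins_home jenkins/jenkins:lts-jdk11
# read the temporary admin password (path used by the official Jenkins image)
docker exec jenkins cat /var/jenkins_home/secrets/initialAdminPassword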
GitHub Repo Creation
- Create a new private repository with an appropriate name in GitHub.
- Create a "Jenkinsfile" with the code below.
pipeline {
agent any
parameters {
string defaultValue: 'internalEventListener', description: 'Iflow Name', name: 'Name', trim: true
}
//Configure the following environment variables before executing the Jenkins Job
environment {
IntegrationFlowID = "${Name}"
CPIHost = "${env.CPI_HOST}"
CPIOAuthHost = "${env.CPI_OAUTH_HOST}"
CPIOAuthCredentials = "${env.CPI_OAUTH_CRED}"
GITRepositoryURL = "${env.GIT_REPOSITORY_URL}"
GITCredentials = "${env.GIT_CRED}"
GITBranch = "${env.GIT_BRANCH_NAME}"
GITFolder = "IntegrationContent/IntegrationArtefacts"
GITComment = "Integration Artefacts update from CICD pipeline"
}
stages {
stage('download integration artefact and store it in GitHub') {
steps {
deleteDir()
script {
//clone repo
checkout([
$class: 'GitSCM',
branches: [[name: env.GITBranch]],
doGenerateSubmoduleConfigurations: false,
extensions: [
[$class: 'RelativeTargetDirectory',relativeTargetDir: "."],
//[$class: 'SparseCheckoutPaths', sparseCheckoutPaths:[[$class:'SparseCheckoutPath', path: env.GITFolder]]]
],
submoduleCfg: [],
userRemoteConfigs: [[
credentialsId: env.GITCredentials,
url: 'https://' + env.GITRepositoryURL
]]
])
//get token
println("Request token");
def token;
try{
def getTokenResp = httpRequest acceptType: 'APPLICATION_JSON',
authentication: env.CPIOAuthCredentials,
contentType: 'APPLICATION_JSON',
httpMode: 'POST',
responseHandle: 'LEAVE_OPEN',
timeout: 30,
url: 'https://' + env.CPIOAuthHost + '/oauth/token?grant_type=client_credentials';
def jsonObjToken = readJSON text: getTokenResp.content
token = "Bearer " + jsonObjToken.access_token
} catch (Exception e) {
error("Requesting the oauth token for Cloud Integration failed:\n${e}")
}
//delete the old flow content so that only the latest content gets stored
dir(env.GITFolder + '/' + env.IntegrationFlowID){
deleteDir();
}
//download and extract artefact from tenant
println("Downloading artefact");
def tempfile = UUID.randomUUID().toString() + ".zip";
def cpiDownloadResponse = httpRequest acceptType: 'APPLICATION_ZIP',
customHeaders: [[maskValue: false, name: 'Authorization', value: token]],
ignoreSslErrors: false,
responseHandle: 'LEAVE_OPEN',
validResponseCodes: '100:399, 404',
timeout: 30,
outputFile: tempfile,
url: 'https://' + env.CPIHost + '/api/v1/IntegrationDesigntimeArtifacts(Id=\''+ env.IntegrationFlowID + '\',Version=\'active\')/$value';
if (cpiDownloadResponse.status == 404){
//invalid Flow ID
error("Received http status code 404. Please check if the Artefact ID that you have provided exists on the tenant.");
}
def disposition = cpiDownloadResponse.headers.toString();
def index=disposition.indexOf('filename')+9;
def lastindex=disposition.indexOf('.zip', index);
def filename=disposition.substring(index + 1, lastindex + 4);
def folder=env.GITFolder + '/' + filename.substring(0, filename.indexOf('.zip'));
def zipfolder=env.GITFolder + '/ZipFiles';
fileOperations([fileUnZipOperation(filePath: tempfile, targetLocation: folder)])
fileOperations([fileRenameOperation(source: tempfile, destination: filename)])
fileOperations([fileCopyOperation(includes: filename, targetLocation: zipfolder)])
env.Filename = filename;
cpiDownloadResponse.close();
//remove the zip
fileOperations([fileDeleteOperation(excludes: '', includes: filename)])
dir(env.GITFolder){
sh 'git add .'
}
println("Store integration artefact in Git")
withCredentials([[$class: 'UsernamePasswordMultiBinding', credentialsId: env.GITCredentials ,usernameVariable: 'GIT_AUTHOR_NAME', passwordVariable: 'GIT_PASSWORD']]) {
sh 'git diff-index --quiet HEAD || git commit -am ' + '\'' + env.GITComment + '\''
sh('git push https://${GIT_PASSWORD}@' + env.GITRepositoryURL + ' HEAD:' + env.GITBranch)
}
}
}
}
stage('Code Analysis') {
steps {
script{
def zipcpilintfile = "cpilint-1.0.4.zip";
def unzipcpilintfile = "cpilint";
fileOperations([fileUnZipOperation(filePath: zipcpilintfile, targetLocation: unzipcpilintfile)])
fileOperations([fileDeleteOperation(excludes: '', includes: zipcpilintfile)])
sh "chmod a+rwx -R $WORKSPACE/cpilint"
sh "$WORKSPACE/cpilint/cpilint-1.0.4/bin/cpilint -rules $WORKSPACE/rules.xml -files $WORKSPACE/IntegrationContent/IntegrationArtefacts/ZipFiles/${env.Filename}"
}
}
}
}
}
Jenkinsfile
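The pipeline above first requests an OAuth token and then calls the design-time OData API to download the artefact. If you want to verify your service key and connectivity before the first run, here is a rough command-line sketch, assuming jq is installed and the placeholders are replaced with values from your service key:
# request an OAuth token via the client-credentials grant (same endpoint the pipeline uses)
TOKEN=$(curl -s -u "<clientid>:<clientsecret>" -X POST "https://<CPI_OAUTH_HOST>/oauth/token?grant_type=client_credentials" | jq -r '.access_token')
# download the active version of an integration flow as a zip (same OData endpoint the pipeline uses)
curl -s -H "Authorization: Bearer $TOKEN" -o internalEventListener.zip "https://<CPI_HOST>/api/v1/IntegrationDesigntimeArtifacts(Id='internalEventListener',Version='active')/\$value"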
- Create a "rules.xml" file. As per the schema cpilint. I have taken one sample xml file. You can modify this as per your requirements.
<?xml version="1.0" encoding="UTF-8"?>
<cpilint>
<rules>
<!-- Require that all iflows have a description. -->
<iflow-description-required/>
<!-- Don't allow the social media receiver adapters. -->
<disallowed-receiver-adapters>
<disallow>facebook</disallow>
<disallow>twitter</disallow>
</disallowed-receiver-adapters>
<!-- Don't allow Router steps configured with both XML and non-XML conditions. -->
<multi-condition-type-routers-not-allowed/>
<!-- Message Mapping and XSLT are the two allowed mapping types. -->
<allowed-mapping-types>
<allow>message-mapping</allow>
<allow>xslt-mapping</allow>
</allowed-mapping-types>
<!-- Make sure that all data store writes are encrypted. -->
<unencrypted-data-store-write-not-allowed/>
</rules>
</cpilint>
- Then download the cpilint zip file and upload it into the GitHub repo; your repo should now look like below. (A command-line sketch of this upload follows the screenshot.)
GitHub Repository Files
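If you prefer the command line to the GitHub web UI for this step, a sketch from a local clone, assuming the release archive has already been downloaded as cpilint-1.0.4.zip and <your-user>/<your-repo> is your repository:
# add the cpilint distribution next to the Jenkinsfile and rules.xml
git clone https://github.com/<your-user>/<your-repo>.git
cd <your-repo>
cp ~/Downloads/cpilint-1.0.4.zip .
git add cpilint-1.0.4.zip rules.xml Jenkinsfile
git commit -m "Add Jenkinsfile, rules.xml and cpilint distribution"
git push origin Main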
- Create a PAT (Personal Access Token) in your GitHub account and save the token.
- Copy the GitHub repo HTTPS URL.
HTTPS URL
SAP Process Integration API Plan Instance - Service key
Log in to the BTP cockpit and save the service key of this instance.
Service Key
Jenkins Basic Configuration
First, create credentials for CPI and GitHub on the Manage Jenkins -> Credentials page.
- Use the CPI service key clientid and clientsecret as the username and password, and give the ID “CPIOAuthCredentials” as below.
CPI Cred
- Create GitHub credentials the same way: use your GitHub user and PAT token as the username and password, and give the ID “GIT_Credentials”.
GitHub Cred
- Then go to Manage Jenkins -> Configure System and configure the environment variables below under Global properties.
| Name | Value |
| --- | --- |
| CPI_HOST | {{url value from Service Key without https://}} e.g. xxxxxxxxxxx.it-cpi002.cfapps.ap10.hana.ondemand.com |
| CPI_OAUTH_CRED | CPIOAuthCredentials |
| CPI_OAUTH_HOST | {{tokenurl value from Service Key without https://}} e.g. xxxxxxxxxxx.authentication.ap10.hana.ondemand.com |
| GIT_BRANCH_NAME | Main |
| GIT_CRED | GIT_Credentials |
| GIT_REPOSITORY_URL | github.com/Asutosh-Integration/IFlowRepo.git |
Environment Variable
- Run the two commands below inside the Jenkins container, e.g. from the container terminal in the Docker Desktop app (an alternative via docker exec is sketched after the commands). Put your own name and email.
git config --global user.name "Your Name"
git config --global user.email "youremail@yourdomain.com"
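If you are not using the Docker Desktop GUI, the same can be done from any host terminal; a sketch, assuming the container is named jenkins (check docker ps for the actual name or ID):
# open a shell inside the Jenkins container
docker exec -it jenkins bash
# then, inside the container:
git config --global user.name "Your Name"
git config --global user.email "youremail@yourdomain.com"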
- Then install the three plugins below. From your Jenkins dashboard, navigate to Manage Jenkins > Manage Plugins and select the Available tab. Locate each plugin by searching for the terms below and install without restart.
- http_request
- pipeline-utility-steps
- file-operations
Jenkins Pipeline configuration
- Click the New Item button on the dashboard, choose Multibranch Pipeline, give it an appropriate name and hit “OK”.
Multibranch Pipeline Configuration
- Then scroll down to Branch Sources, select the GitHub credential, and enter your GitHub repo URL in Repository HTTPS URL as per the configuration below. Then hit Save.
Branch Source
Build and Test
- Then click the Build button in the main branch and provide the iFlow ID in the Name parameter as in the screenshot below. (A REST API alternative is sketched after the screenshot.)
Build with Parameter
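The parametrized build can also be triggered via the Jenkins REST API; a sketch, assuming a Jenkins user with an API token and a multibranch job named IFlowBackup (both names are assumptions, adjust to your setup):
# trigger the Main branch job with the iFlow ID as the Name parameter
curl -X POST -u "<jenkins-user>:<api-token>" "http://localhost:8080/job/IFlowBackup/job/Main/buildWithParameters" --data-urlencode "Name=internalEventListener"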
- The build will then complete and show the status of each stage.
Build status
- You can now go to your GitHub repo and see that the iFlow files are stored. Try changing the iFlow, saving it as a new version, and then building the pipeline again for the same iFlow as in step 1.
Change in Iflow and Save as a version
- After the build completes, you will be able to see the comparison between the two versions in GitHub. (A local alternative with git is sketched after the screenshot.)
Diff
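Because every pipeline run creates a commit, the same comparison is also available from a local clone of the repository; a sketch, assuming the iFlow ID internalEventListener:
# fetch the latest pipeline commit and compare it with the previous one
git pull
git log --oneline -- IntegrationContent/IntegrationArtefacts/internalEventListener
git diff HEAD~1 HEAD -- IntegrationContent/IntegrationArtefacts/internalEventListener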
- You can also check the basic validation logs generated by the cpilint application in the Jenkins build logs; the same checks can be reproduced locally, as sketched below. This gives you full flexibility to configure cpilint and automate validations via Jenkins + cpilint. Thanks, Morten, for this cool tool.
cpilint logs
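The Code Analysis stage simply calls the cpilint command line with the rules file and the downloaded zip, so the same checks can be reproduced on your machine; a sketch, assuming cpilint-1.0.4 is extracted in a local clone of the repo and that the zip file name matches the iFlow (adjust to whatever the pipeline stored under ZipFiles):
# run the same checks the pipeline runs, with the same flags
./cpilint-1.0.4/bin/cpilint -rules rules.xml -files IntegrationContent/IntegrationArtefacts/ZipFiles/internalEventListener.zip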
Conclusion
This setup has neither been tested in a productive scenario nor undergone a thorough risk analysis, so please do a proper analysis before implementing it. There is still a lot of room for improvement towards a fully automated pipeline: more stages could be added for automatic deployment to another tenant and for testing with automated test scripts.