Dear community,


A few days ago, I started working on a PoC for a CI/CD Azure pipeline for Integration content. The very first step was to learn how the Piper commands work.

For more information about project "Piper" and the other CI and CD offerings by SAP, see Overview of SAP Offerings for CI and CD. For more about CI/CD for SAP Integration Suite, see CI/CD for SAP Integration Suite? Here you go!

And of course, the blog post that inspired me to write this one: Working with Integration Suite Piper commands.


Let's follow these steps:

  1. Creating Azure DevOps Project

  2. Configure Cloud Integration API service key in Azure DevOps as security credentials

  3. Creating a new pipeline project in Azure DevOps

  4. Running pipeline project and verifying results

  5. Conclusion


Creating Azure DevOps Project

Let's create the project "SCP-Pipeline" and a new repo "Garage.SAPCI.PoC", and finally clone it in VS Code.



Configure Cloud Integration API service key in Azure DevOps as security credentials

Get the service key, as explained in step 2 of the referenced blog post.

Below are the two steps to perform with the service key file.

  • Convert the service key payload from JSON to a JSON string. I found this online tool useful for the task.

  • Beautify the JSON string by replacing the double quote (") at the beginning and the end with an apostrophe (') and removing each backslash (\). The beautified JSON string should look like the one below. With this small change, Piper sends the complete client secret in the HTTP request instead of just the part before the $ character, which could otherwise cause an HTTP 401 error.
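As a sketch, the two steps above can also be scripted. The Python snippet below uses an illustrative service key payload (the field values are placeholders; a real key comes from the SAP BTP cockpit):

```python
import json

# Illustrative service key payload (a real one comes from the BTP cockpit)
service_key = {
    "oauth": {
        "clientid": "sb-abc123",
        "clientsecret": "firstpart$secondpart",  # note the '$' that can get truncated
        "tokenurl": "https://demo.authentication.eu10.hana.ondemand.com/oauth/token",
        "url": "https://demo.it-cpi.cfapps.eu10.hana.ondemand.com/api/v1",
    }
}

# Step 1: convert the JSON payload to a JSON string (quotes get backslash-escaped)
json_string = json.dumps(json.dumps(service_key))

# Step 2: swap the outer double quotes for apostrophes and remove the backslashes
beautified = "'" + json_string[1:-1].replace("\\", "") + "'"

print(beautified)
```

The result is the original JSON wrapped in apostrophes, so the whole client secret (including anything after the $) survives as one value.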



Create a variable "CREDENTIALS" and set its value to the modified JSON string payload.


Creating a new pipeline project in Azure DevOps

The important part is to download Piper from GitHub. Then you can use the Piper executable in the jobs. In the example below, the first job downloads Piper and puts the executable into a cache. The next job gets Piper from the cache and performs an action on Cloud Integration content.


a.- Get piper and put the executable into a cache.
# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:

trigger:
- main

jobs:
- job: downloadPiper
  pool:
    vmImage: ubuntu-latest
  steps:
  - checkout: none
  - task: Cache@2
    inputs:
      key: piper-go-official
      path: bin
      cacheHitVar: FOUND_PIPER
    displayName: Cache piper go binary
  - script: |
      mkdir -p bin
      # Download the Piper go binary from the official SAP jenkins-library releases
      curl -L --output bin/piper https://github.com/SAP/jenkins-library/releases/latest/download/piper
      chmod +x bin/piper
    condition: ne(variables.FOUND_PIPER, 'true')
    displayName: 'Download Piper'
  - script: bin/piper version
    displayName: 'Piper Version'


b.- Piper command to deploy an iFlow
- job: deployiFlow
  dependsOn: downloadPiper
  variables:
  - group: development
  pool:
    vmImage: 'ubuntu-latest'
  steps:
  - task: Cache@2
    inputs:
      key: piper-go-official
      path: bin
    displayName: Restore piper from cache
  - script: |
      bin/piper integrationArtifactDeploy --verbose --apiServiceKey $(CREDENTIALS) --integrationFlowId "TestCICD"
    displayName: deploy iflow

Running pipeline project and verifying results

Below are the results of the pipeline execution.

The deployment status can also be checked in the SAP Cloud Integration Web UI under Manage Integration Content.

You can combine these Piper commands to build more complex scenarios.
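As one sketch of such a combination, a follow-up job could check the message processing log after deployment using another command from Piper's integrationArtifact* family (step and parameter names should be verified against the Piper documentation; the same cache and CREDENTIALS variable as above are assumed):

- job: verifyiFlow
  dependsOn: deployiFlow
  variables:
  - group: development
  pool:
    vmImage: 'ubuntu-latest'
  steps:
  - task: Cache@2
    inputs:
      key: piper-go-official
      path: bin
    displayName: Restore piper from cache
  - script: |
      bin/piper integrationArtifactGetMplStatus --apiServiceKey $(CREDENTIALS) --integrationFlowId "TestCICD"
    displayName: Check message processing log status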



Conclusion

Finally, with the above instructions, we can run Piper commands from a Microsoft Azure DevOps pipeline. Based on this foundation, we can create more complex scenarios.

I hope you find this blog post useful. You are very welcome to provide feedback or thoughts in the comment section. And thanks, mayurbelur.mohan, for supporting me in this journey!

Related to this topic, you can also find Q&A and post questions under the tags DevOps, SAP Integration Suite, SAP BTP, and Cloud Foundry environment.