Technology Blogs by SAP
Learn how to extend and personalize SAP applications. Follow the SAP technology blog for insights into SAP BTP, ABAP, SAP Analytics Cloud, SAP HANA, and more.
mayurbelur_mohan
Product and Topic Expert
SAP implements tooling for continuous delivery in the open-source project "Piper". The goal of project "Piper" is to substantially ease setting up continuous delivery in your project using SAP technologies.

For more information about project “Piper” and the other CI and CD offerings by SAP, see Overview of SAP Offerings for CI and CD.

Configuring project "Piper" in your Jenkins server is explained in this blog.

SAP Integration Suite has contributed not only many Jenkins pipelines, but also many Piper library steps, which help you create your own CI/CD pipeline and automate various tasks.

For example, in the case of the SAP Integration Suite capability "Cloud Integration", we can automate the following scenario:

  1. Update the integration flow design time configuration parameter.

  2. Deploy an integration flow.

  3. Get the service endpoint of the deployed integration flow.

  4. Invoke the service endpoint with the HTTP request.

  5. Get message processing log (MPL) status of the integration flow.

  6. If the MPL status from the previous step is "Completed", download the integration flow artifact from design time.

  7. Store the integration flow artifact in the GitHub repository.


Let’s understand all the Piper commands provided for SAP Integration Suite capabilities, mainly Cloud Integration and API Management.

SAP Cloud Integration Piper Steps

Each step below has documentation and a pipeline example linked in the project "Piper" library documentation.

integrationArtifactDeploy: Deploy an integration flow into the SAP Cloud Integration runtime

integrationArtifactDownload: Download an integration flow runtime artifact

integrationArtifactGetMplStatus: Get the MPL status of an integration flow

integrationArtifactGetServiceEndpoint: Get the service endpoint of a deployed integration flow

integrationArtifactResource: Add, delete, or update a resource file of an integration flow design time artifact

integrationArtifactUnDeploy: Undeploy an integration flow

integrationArtifactUpdateConfiguration: Update an integration flow configuration parameter

integrationArtifactUpload: Upload or update an integration flow design time artifact

integrationArtifactTransport: Transport an integration package using the SAP Content Agent service


SAP API Management Piper Steps

Each step below has documentation and a pipeline example linked in the project "Piper" library documentation.

apiProxyDownload: Download a specific API proxy from the API Portal

apiKeyValueMapDownload: Download a specific key value map from the API Portal

apiProxyUpload: Upload an API proxy artifact into the API Portal


Going forward, more Piper steps will be contributed for both Cloud Integration and API Management.

Now let's take an example of consuming the Cloud Integration Piper command "integrationArtifactDeploy" in the Jenkins server.

This involves the following steps:

  1. Creating a Jenkins pipeline project in GitHub that consumes the Cloud Integration Piper command

  2. Configuring the Cloud Integration API service key in the Jenkins server as a security credential

  3. Configuring the Piper library under Global Pipeline Libraries

  4. Creating a new pipeline project in Jenkins based on the pipeline-script-from-SCM approach

  5. Running the pipeline project and verifying the results


 

Creating a Jenkins pipeline project in GitHub that consumes the Cloud Integration Piper command

Let's consume the integrationArtifactDeploy Piper command, which is responsible for deploying an integration flow into the Cloud Integration runtime.

The first step is to create a GitHub repository as shown below.


The repository has a directory called .pipeline and a file named Jenkinsfile. The Jenkinsfile contains the Groovy script that invokes the integrationArtifactDeploy Piper command.


Here, the line "integrationArtifactDeploy script: this" executes the Cloud Integration Piper command.
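As a reference, a minimal Jenkinsfile for this setup could look like the sketch below; the library name 'piper-lib-os' is an assumption and must match whatever name you configured under Global Pipeline Libraries.

```groovy
// Minimal sketch of a Jenkinsfile invoking the Piper step.
// 'piper-lib-os' is a placeholder for the shared library name configured
// under Manage Jenkins -> Global Pipeline Libraries.
@Library('piper-lib-os') _

node {
    stage('Deploy integration flow') {
        // Pull the repository so .pipeline/config.yml is available
        checkout scm
        // Executes the Cloud Integration Piper step; its parameters
        // are read from .pipeline/config.yml
        integrationArtifactDeploy script: this
    }
}
```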

.pipeline contains the config.yml file, which provides the input arguments for the integrationArtifactDeploy Piper command.


Here we pass the ID of the integration flow to be deployed and the API service key details, which are configured in the Jenkins system as a security credential.
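A sketch of such a config.yml with placeholder values; the flow ID and the credentials ID must match your tenant and your Jenkins credential entry.

```yaml
# .pipeline/config.yml (sketch; both values below are placeholders)
steps:
  integrationArtifactDeploy:
    integrationFlowId: 'MyIntegrationFlow'        # design time ID of the flow to deploy
    cpiApiServiceKeyCredentialsId: 'CPI_API_KEY'  # Jenkins secret-text credential ID
```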

Configure Cloud integration API service key in the Jenkins server as security credentials

The API service key needs to be configured as a Jenkins credential so that all pipeline projects can use it.

Select the Manage Jenkins configuration option in the Jenkins server home page.


 

Select the Manage Credentials option


 

Select the Global credentials


Copy the service key JSON text from the SAP BTP cockpit; it can be found at the following location:

SAP BTP cockpit subaccount home page -> Instances and Subscriptions -> instance name (API plan) -> Service Keys (view and copy the JSON text)


Create a new credential of type "Secret text" under the Add Credentials option, paste the service key text into the secret text input box, and save it.


 

The same ID used here needs to be passed as the input for the cpiApiServiceKeyCredentialsId configuration parameter in the config.yml file as shown below.


Configure the piper library in the Global pipeline libraries

Provide the piper library runtime configuration in the Jenkins configuration -> Global pipeline libraries section.


Creating new pipeline project in Jenkins based on pipeline script from SCM approach.

Select New Item -> Pipeline project and provide the project name.


 

Configure the repository URL, branch to pull and script file name as shown below


 

Click Save/Apply to save the project. This creates a Jenkins pipeline project that pulls the configured GitHub repository and executes the Jenkinsfile, which contains the logic to run the SAP Cloud Integration Piper command.

Running pipeline project and verifying results

This step involves building and running the Jenkins pipeline project and verifying the SAP Cloud Integration Piper command execution results.

Click Build Now and check the latest build results.


If you select the specific build and check the console output, you can validate whether the Piper command executed successfully.


You can combine these Piper commands to build a complex scenario that manages the end-to-end lifecycle of an integration flow artifact for CI/CD: configure, deploy, check the execution status, download, store in Git, and so on.
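For illustration, such a combined pipeline could be sketched as below. The stage names and the library name are assumptions, and each step reads its parameters from .pipeline/config.yml.

```groovy
// Sketch of a combined lifecycle pipeline chaining the steps listed earlier.
// 'piper-lib-os' is a placeholder for the shared library name configured
// in Jenkins; all step parameters come from .pipeline/config.yml.
@Library('piper-lib-os') _

node {
    stage('Checkout')   { checkout scm }
    stage('Configure')  { integrationArtifactUpdateConfiguration script: this }
    stage('Deploy')     { integrationArtifactDeploy script: this }
    stage('MPL status') { integrationArtifactGetMplStatus script: this }
    stage('Download')   { integrationArtifactDownload script: this }
}
```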

If you want to build your own custom Piper command for Integration Suite, you can contribute to the open-source SAP project "Piper" (https://www.project-piper.io/) and follow the developer guide to build your own custom shared library steps.

Below is a hello-world Piper command sample, which you can use as a reference to build in your own forked Piper GitHub repository (from the Piper master repository, i.e. https://github.com/SAP/jenkins-library) and test on a Jenkins server.

Hello world example step development commits, which show the workflow: https://github.com/marcusholl/jenkins-library/commits/helloWorld

Pipeline to build and run the code in a Jenkins: https://github.com/marcusholl/helloWorld/blob/main/Jenkinsfile#L10

Example of a groovy wrapper with username, Password credentials: https://github.com/SAP/jenkins-library/blob/master/vars/cloudFoundryCreateSpace.groovy

 

 
44 Comments
Clen
Participant
Great Post @Mayur!

Any idea how I can implement the Piper commands for SAP Integration Suite-Cloud Integration and Azure DevOps?

 
mayurbelur_mohan
Product and Topic Expert

Thank you @Alexander Clen Riva

There seem to be two options.

Option 1

Use the Piper Docker image and run the Integration Suite Piper commands, as explained for other commands in the blog https://blogs.sap.com/2019/10/24/how-to-use-project-piper-docker-images-for-cicd-with-azure-devops/

Option 2

Wrap a Jenkins CI job inside an Azure pipeline. In this approach, a build definition is configured in Azure Pipelines to use the Jenkins tasks to invoke a CI job in Jenkins, then download and publish the artifacts produced by Jenkins.

Best Regards,
Mayur

mayurbelur_mohan
Product and Topic Expert
The important part is to download Piper from GitHub. Then you can use the Piper executable in the jobs. In the example, the first job gets Piper and puts the executable into a cache; the second job gets Piper from the cache and performs some action.

Here is an example (snippet):
jobs:
- job: downloadPiper
  pool:
    vmImage: 'ubuntu-latest'
  container:
    image: <your docker image>
    options: -u 0
  steps:
  - checkout: none
  - task: Cache@2
    inputs:
      key: piper-go-official
      path: bin
      cacheHitVar: FOUND_PIPER
    displayName: Cache piper go binary
  - script: |
      mkdir -p bin
      curl -L --output bin/piper https://github.com/SAP/jenkins-library/releases/download/v1.190.0/piper
      chmod +x bin/piper
    condition: ne(variables.FOUND_PIPER, 'true')
    displayName: 'Download Piper'
  - script: bin/piper version
    displayName: 'Piper Version'
- job: piperDoesSomeAction
  dependsOn: downloadPiper
  pool:
    vmImage: 'ubuntu-latest'
  container:
    image: <your docker image>
    options: -u 0
  steps:
  - task: Cache@2
    inputs:
      key: piper-go-official
      path: bin
    displayName: resolve piper go binary from cache
  - script: |
      bin/piper version
    displayName: 'some action'
Clen
Participant
Thank you for your answer mayurbelur.mohan

I am working on the option 1. I guess I can use the docker image mentioned in the blog https://blogs.sap.com/2019/10/24/how-to-use-project-piper-docker-images-for-cicd-with-azure-devops/.
mayurbelur_mohan
Product and Topic Expert
That's great, let me know how it goes and also any enhancements you need to Integration Suite piper commands
Clen
Participant
Hi mayurbelur.mohan

I am using the image "https://github.com/SAP/devops-docker-images" but I am not sure if it is the right one for Cloud Integration artifacts.

I have the below pipeline.
# Starter pipeline
# Start with a minimal pipeline that you can customize to build and deploy your code.
# Add steps that build, run tests, deploy, and more:
# https://aka.ms/yaml

trigger:
- main

jobs:
- job: downloadPiper
  pool:
    vmImage: ubuntu-latest
  container:
    image: https://github.com/SAP/devops-docker-images
    options: -u 0
  steps:
  - checkout: none
  - task: Cache@2
    inputs:
      key: piper-go-official
      path: bin
      cacheHitVar: FOUND_PIPER
    displayName: Cache piper go binary
  - script: |
      mkdir -p bin
      curl -L --output bin/piper https://github.com/SAP/jenkins-library/releases/download/v1.190.0/piper
      chmod +x bin/piper
    condition: ne(variables.FOUND_PIPER, 'true')
    displayName: 'Download Piper'
  - script: bin/piper version
    displayName: 'Piper Version'

- job: piperDoesSomeAction
  dependsOn: downloadPiper
  pool:
    vmImage: 'ubuntu-latest'
  container:
    image: https://github.com/SAP/devops-docker-images
    options: -u 0
  steps:
  - task: Cache@2
    inputs:
      key: piper-go-official
      path: bin
    displayName: resolve piper go binary from cache
  - script: |
      bin/piper version
    displayName: 'some action'

- job: deployiFlow
  dependsOn: downloadPiper
  variables:
  - group: development
  pool:
    vmImage: 'ubuntu-latest'
  container:
    image: https://github.com/SAP/devops-docker-images
    options: -u 0
  steps:
  - task: Cache@2
    inputs:
      key: piper-go-official
      path: bin
    displayName: deploy iflow
  - script: |
      bin/piper integrationArtifactDeploy --apiServiceKey $(CREDENTIALS) --integrationFlowId $(IFLOWID)

But I am getting the below error. Any idea? 🙂
Starting: CmdLine
==============================================================================
Task : Command line
Description : Run a command line script using Bash on Linux and macOS and cmd.exe on Windows
Version : 2.198.0
Author : Microsoft Corporation
Help : https://docs.microsoft.com/azure/devops/pipelines/tasks/utility/command-line
==============================================================================
Generating script.
Script contents:
bin/piper integrationArtifactDeploy --apiServiceKey { "oauth": { "clientid": "xx-xxxxxxxxx-xxx-xxxxxxxx|it!xxx", "clientsecret": "xxxxx-xxx-xxxxxxxxxxxxxxxxxxxxxxxxxx=", "url": "https://xxxxxxtrial.it-cpitrial05.cfapps.us10-001.hana.ondemand.com", "tokenurl": "https://xxxxxxtrial.authentication.us10.hana.ondemand.com/oauth/token" } } --integrationFlowId CostCenter_Replication_from_X_to_Y
========================== Starting Command Output ===========================
/usr/bin/bash --noprofile --norc /home/vsts/work/_temp/f57a3325-1a48-416a-bcc0-887986890779.sh
info integrationArtifactDeploy - Using stageName '__default' from env variable
info integrationArtifactDeploy - Project config: NONE ('.pipeline/config.yml' does not exist)
info integrationArtifactDeploy - fatal error: errorDetails****"category":"undefined","correlationId":"https://dev.azure.com/xxxxxxx/SCP-Pipeline/_build/results?buildId=38","error":"error unmarshalling serviceKey: unexpected end of JSON input","library":"SAP/jenkins-library","message":"step execution failed","result":"failure","stepName":"integrationArtifactDeploy"}
fatal integrationArtifactDeploy - step execution failed - error unmarshalling serviceKey: unexpected end of JSON input
##[error]Bash exited with code '1'.
Finishing: CmdLine
mayurbelur_mohan
Product and Topic Expert
This https://github.com/SAP/devops-docker-images is a GitHub repository where we describe which images we provide and where to find them. As you are just calling APIs from Piper, please remove the images and run directly on the VM.
vijay_kumar133
Active Participant
Hi Alex,

Were you able to resolve it and make it work end to end with Azure DevOps for integration artifacts?

Regards

Vijay
mayurbelur_mohan
Product and Topic Expert
Is the issue resolved, Alex?

Best Regards,

Mayur
Clen
Participant
Hi mayurbelur.mohan the below issue is still a blocker.
info  integrationArtifactDeploy - Project config: NONE ('.pipeline/config.yml' does not exist)
info integrationArtifactDeploy - fatal error: errorDetails****"category":"undefined","correlationId":"https://dev.azure.com/xxxxxxx/SCP-Pipeline/_build/results?buildId=38","error":"error unmarshalling serviceKey: unexpected end of JSON input","library":"SAP/jenkins-library","message":"step execution failed","result":"failure","stepName":"integrationArtifactDeploy"}
fatal integrationArtifactDeploy - step execution failed - error unmarshalling serviceKey: unexpected end of JSON input
##[error]Bash exited with code '1'.

 

not sure why is asking for .pipeline/config.yml. I am working with Azure DevOps - azure-pipelines.yml
mayurbelur_mohan
Product and Topic Expert

The issue is that the service key JSON document is invalid. When you use it in a secret environment variable, I suggest using the JSON document as a single-line string without CR or CR/LF.
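For example, one way to strip the line breaks before pasting the key is sketched below; the file names and the demo key content are placeholders.

```shell
# Sketch: flatten a multi-line service key JSON into a single line
# (no CR/LF) before storing it as secret text.
# /tmp/key.json stands in for the file holding the copied service key.
printf '{\n  "oauth": {\n    "clientid": "demo"\n  }\n}\n' > /tmp/key.json
tr -d '\r\n' < /tmp/key.json > /tmp/key-oneline.json
cat /tmp/key-oneline.json
```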

 

mayurbelur_mohan
Product and Topic Expert
Any luck, Alex? Is the issue solved?
Clen
Participant
Hi Mayur,

Now the error is different. It looks like something is missing in the conversion from the JSON file to a string.
I verified from Postman that the client ID and client secret are working, and the permission to deploy iflows using the API is also validated; I mean, from Postman I can deploy an iFlow.

Below are the logs (I changed the client ID and client secret values a little, but kept the format).
Starting: CmdLine
==============================================================================
Task : Command line
Description : Run a command line script using Bash on Linux and macOS and cmd.exe on Windows
Version : 2.198.0
Author : Microsoft Corporation
Help : https://docs.microsoft.com/azure/devops/pipelines/tasks/utility/command-line
==============================================================================
Generating script.
Script contents:
bin/piper integrationArtifactDeploy --verbose --apiServiceKey "{ \"oauth\": { \"clientid\": \"sb-aaaaaaa-85ad-zzzz-yyyy-xxxxxxxxxxxx|it!b99999\", \"clientsecret\": \"rrrrr-zzzzzzz-89be-pppppppp$Mi_uuuuuuuuuuuuuu-ooooooo=\", \"tokenurl\": \"https://16072687trial.authentication.us10.hana.ondemand.com/oauth/token\", \"url\": \"https://16072687trial.it-cpitrial05.cfapps.us10-001.hana.ondemand.com\" }}" integrationFlowId 'TestCICD'
========================== Starting Command Output ===========================
/usr/bin/bash --noprofile --norc /home/vsts/work/_temp/dabe8869-117f-4fb4-8550-b4acecf71ec4.sh
info integrationArtifactDeploy - Using stageName '__default' from env variable
debug integrationArtifactDeploy - Reading file from disk: .pipeline/commonPipelineEnvironment/custom/gcsFolderPath.json
info integrationArtifactDeploy - Project config: NONE ('.pipeline/config.yml' does not exist)
debug integrationArtifactDeploy - Skipping fetching secrets from Vault since it is not configured
info integrationArtifactDeploy - CPI serviceKey read successfully
debug integrationArtifactDeploy - Using Basic Authentication ****/****
debug integrationArtifactDeploy - no trusted certs found / using default transport / insecure skip set to true / : continuing with existing tls config
debug integrationArtifactDeploy - Transport timeout: 3m0s, max request duration: 0s
info integrationArtifactDeploy - [DEBUG] POST https://16072687trial.authentication.us10.hana.ondemand.com/oauth/token?grant_type=client_credential...
debug integrationArtifactDeploy - --------------------------------
debug integrationArtifactDeploy - --> POST request to https://16072687trial.authentication.us10.hana.ondemand.com/oauth/token?grant_type=client_credential...
debug integrationArtifactDeploy - headers: map[Accept:[application/json] Authorization:[<set>]]
debug integrationArtifactDeploy - cookies:
debug integrationArtifactDeploy - --------------------------------
debug integrationArtifactDeploy - <-- response 401 https://16072687trial.authentication.us10.hana.ondemand.com/oauth/token?grant_type=client_credential... (441.58ms)
debug integrationArtifactDeploy - --------------------------------
mayurbelur_mohan
Product and Topic Expert

There are two service keys: one for the API plan, another for the integration-flow plan. Which one have you used? You have to use the API plan service key.

Clen
Participant
Hi Mayur,
I am using the API plan service key and I verified from Postman and it is working fine to deploy the iflows.


 
Clen
Participant

Hi mayurbelur.mohan The root cause is that the client secret contains the character $. This character cuts off the client secret value, and the response code is always 401.
I encoded the client secret, but it is still not working.
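One likely culprit here is shell expansion: in Bash, a $ inside double quotes starts variable substitution, while single quotes pass the value through literally. A sketch with a placeholder secret:

```shell
# Placeholder secret containing '$', as in the thread.
# Inside double quotes, '$Mi_efgh' would be expanded as a variable;
# single quotes keep the literal value intact.
secret='abcd$Mi_efgh='
echo "$secret"
```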

mayurbelur_mohan
Product and Topic Expert

please schedule a call

Clen
Participant
Sure, I sent you the meeting invite.
jerryjanda
Community Manager
Hi, clen.riva and mayurbelur.mohan :

I'm glad that you were able to connect, but please note that sharing personal email addresses publicly violates our rules of engagement (https://community.sap.com/resources/rules-of-engagement). I've removed that from the comment. When members want to connect, we recommend following a member (via his or her profile), then leaving a comment asking that member to follow back. When members follow each other, they then have the ability to connect via the community's private messages.

Kind regards,

--Jerry

Moderation Lead
ghaetrer
Explorer

When is SAP expecting to support management of new artifact types, such as SOAP and REST API, via the OData API so it can be leveraged with Project Piper?

Is there any expectation that all assets, such as Script Collections and Value Mappings, will also be supportable via the same processes used for Integration Flows?

Additionally, the https://api.sap.com/api/IntegrationContent/resource with the "New API Hub" option selected does not support environment configurations for Cloud Foundry. 

former_member791832
Participant
Hi Mayur,

Thank you for sharing an insightful and well-explained blog.

 

For a project requirement, we were trying to use GitLab instead of Jenkins and GitHub. Where we are trying to establish the source code and CI/CD.

Can you please guide us on how to go about it and if it is possible in the first place? We tried using a similar piper code but it doesn't build.

If not, what would be the next option, perhaps, source code in GitLab and CI/CD in BTP.

Please guide us on how to go about it.

 

 

Thanks,

Ramya

 

 
mayurbelur_mohan
Product and Topic Expert

Please refer to the YouTube video https://www.youtube.com/watch?v=jUiKi6FWYrg, which explains how to create a basic CI/CD pipeline in GitLab.

After successfully completing it, replace that YAML and build a YAML like the one below.

 

Take the credentials from ServiceKey JSON of your process integration instance in subaccount.

Please note that:

GitLab pipelines are based on YAML, as are Azure DevOps pipelines. I suggest preparing the GitLab pipeline YAML by looking at the working Azure DevOps pipeline YAML showcased here: Working with Integration Suite Piper commands and Microsoft Azure DevOps | SAP Blogs

There is an example of interoperability between both YAMLs explained here: Gitlab yaml to azure pipelines yaml file - Stack Overflow

former_member791832
Participant
Hi Mayur,

Thank you for replying back. I did try this and got the below error while building:


I have followed the exact procedure. Please help me understand where I could have made a blunder.

 

Also, please confirm whether we can automate the deployment of SAP CPI iflows using GitLab alone.

 

Thanks,

Ramya

 
mayurbelur_mohan
Product and Topic Expert
Maybe the curl command didn't work and the Piper binary wasn't downloaded. If the curl command doesn't exist, the binary needs to be downloaded using native OS commands.

Please provide the GitLab logging for the job you executed; then we will know the status of each command execution in the script.

You can check in the following order:

  1. Does the curl command exist in the OS?

  2. Did curl download the Piper binary or not?

  3. Was the bin/piper directory created properly in the filesystem?

  4. What is the output of the Piper command execution?


 
former_member791832
Participant
Hi Mayur,

 

Attaching the job screenshots. While the job for download piper shows successful, I can't see the directory in the filesystem.

Can you please help me out there?

Attaching the gitlab logging for the job .


 

 


 


 


 


 
mayurbelur_mohan
Product and Topic Expert
Do both jobs use separate VMs when they are executed? If so, the binary downloaded by the first job may not be available to the second job.

Can you combine both into a single job and print the downloaded binary signature using echo or ls before executing any command from the Piper binary?
former_member791832
Participant

Hi Mayur,

Both JOBS use a single VM. I clubbed both in a single job and ended up with the following :

 

 

 

Thanks,

Ramya

mayurbelur_mohan
Product and Topic Expert
Now I see the Piper binary executed and made the POST request as well. The only issue I am seeing here is that the iflow being used for deployment is not found; that's what the error at line no 65 specifies, i.e. a 404 error code means the resource (iflow) was not found.

Please check:

  1. The iflow ID is valid

  2. Try to deploy it using Postman once, to cross-check that you can use that iflow ID

  3. Once the API works in Postman, provide the same iflow ID to the Piper command


 

 

 
former_member791832
Participant
Hi Mayur,

The flow id is correct. Attaching the screenshot of how I am getting the Iflow ID.


 

Also, I tried different iflow ids and yet I get a 404 error even in postman:


 

The actual endpoint from CPI looks like this, maybe that is why it is unable to find the right path.



 

 

Please let me know if I am configuring the flow id wrong or where else I could be at fault.

 

 

Thanks ,

Ramya
mayurbelur_mohan
Product and Topic Expert

Please take the CPI host URL from the service key created for the Process Integration Runtime instance (API plan). Please refer to the script https://github.com/SAP/apibusinesshub-integration-recipes/blob/master/Recipes/for/CICD-DeployIntegra... on how to use the iflow deploy API.

former_member791832
Participant
Hi Mayur,

 

Thank you so much for all the guidance. I am able to do it successfully now.

 

While I can deploy integration artefact using piper commands in GitLab, I was wondering if I could implement this scenario:

Complete syncing of SAP CPI with GitLab, where any changes in the integration artefact should be making changes in GitLab as well.

Will that be possible?

 

Any leads would be appreciated.

 

Thanks,

Ramya

 

 
mayurbelur_mohan
Product and Topic Expert
Is the issue resolved, Ramya?
former_member791832
Participant
Hi Mayur,

The previous issue was resolved, and I was trying version management of iflows in GitLab/GitHub using Azure DevOps.

Can we do a version comparison and only store the latest version of the artifact in the repository on GitLab/GitHub, using Azure DevOps for CI/CD with the Piper commands?

 

 

Thanks,

Ramya
I get the below error while trying to use CAS and TMS to deploy integration artifacts.

+ ./piper integrationArtifactTransport info integrationArtifactTransport - Using stageName 'integrationArtifactTransport Command' from env variable info integrationArtifactTransport - Project config: '.pipeline/config.yml' info integrationArtifactTransport - CPI serviceKey read successfully info integrationArtifactTransport - fatal error: errorDetails{"category":"undefined","correlationId":"http://35.240.x.x:8080/job/MyfirstTransport/6/","error":"failed to fetch Bearer Token: parse \":///oauth/token?grant_type=client_credentials\u0026response_type=token\": missing protocol scheme","library":"SAP/jenkins-library","message":"step execution failed","result":"failure","stepName":"integrationArtifactTransport","time":"2023-02-12T15:51:27.393681667Z"} fatal integrationArtifactTransport - step execution failed - failed to fetch Bearer Token: parse ":///oauth/token?grant_type=client_credentials&response_type=token": missing protocol scheme info integrationArtifactTransport - Step telemetry data:{"StepStartTime":"2023-02-12 15:51:27.390897599 +0000 UTC","PipelineURLHash":"a356f5cb09f600cb0f7b0fb2a239b558680599c9","BuildURLHash":"e3e09af87e68ce44b9b48cbf6c0a37f9a098b57a","StageName":"integrationArtifactTransport Command","StepName":"integrationArtifactTransport","ErrorCode":"1","StepDuration":"3","ErrorCategory":"undefined","CorrelationID":"http://35.240.x.x:8080//job/MyfirstTransport/6/","PiperCommitHash":"54d0c68feb5e9fc92b5567c3701ccecf302fbe49","ErrorDetail":{"category":"undefined","correlationId":"http://35.240.137.76:8080/job/MyfirstTransport/6/","error":"failed to fetch Bearer Token: parse \":///oauth/token?grant_type=client_credentials\u0026response_type=token\": missing protocol scheme","library":"SAP/jenkins-library","message":"step execution failed","result":"failure","stepName":"integrationArtifactTransport","time":"2023-02-12T15:51:27.393681667Z"}}

 

I have added the Service Key JSON text of CAS Service Instance with the Alias caspocServiceKeyCred as described in integrationArtifactTransport/config.yml at main · mayurmohan/integrationArtifactTransport (github.com)

 

Regards,

Senthil
mayurbelur_mohan
Product and Topic Expert
You should format the service key like below before storing it as secret text in the Jenkins service:

 

{
  "oauth": {
    "createdate": "2022-09-12T10:51:55.474Z",
    "clientid": "sb-64f36979-xxx-468b-b36e-a5a9e19c6a9b!b1509|it!b68",
    "url": "https://content-agent-engine.cfapps.xxx.hana.ondemand.com",
    "clientsecret": "af770xx-ab05-4868-a52c-9b87f8aaafd5$cLPOilWdzF2hpNt_9Lq140AH4v6c35RItT5zr9YeD6I=",
    "tokenurl": "https://xxxxx.authentication.xxx.hana.ondemand.com/oauth/token"
  }
}

 

You can get this from the CAS instance application service plan, which has more fields; format the JSON as mentioned above and store it as secret text. From the error, it's clear that the Piper step is not able to read the URL details properly.
Hi Mayur,

Thanks! I got it working now. I am now able to run a pipeline job to trigger a transport request that lands in SAP TMS!

One question though: what is the easiest way to get the "resourceID"? I don't see it in clear text anywhere in the package.

steps:
  integrationArtifactTransport:
    casApiServiceKeyCredentialsId: caspocServiceKeyCred
    integrationPackageId: TestTransport
    resourceID: d9c3fe08ceeb47a2991e53049f2ed766
    name: TestTransport
    version: 1.0

 

Regards,

Senthil
mayurbelur_mohan
Product and Topic Expert
Use the OData API, mainly /api/v1/IntegrationPackages?$format=json
former_member842591
Discoverer
Hi Mayur,

 

I have followed your blog and the SAP TechEd 2022 session IN180 to configure the steps to deploy the integration flow using the Piper commands via Jenkins. In the session I couldn't find the configuration of the shared library path. I followed the shared library path provided in the blog and ran the Build Now option, and received the below error message. Could you please help me configure the shared library path?

 
Fetching upstream changes from https://github.com/SAP/jenkins-library.git
> git.exe --version # timeout=10
> git --version # 'git version 2.39.2.windows.1'
> git.exe fetch --no-tags --force --progress -- https://github.com/SAP/jenkins-library.git +refs/heads/*:refs/remotes/origin/* # timeout=10
Checking out Revision 906512a162e00bf283a2efe897154e8b55b452d3 (master)
> git.exe config core.sparsecheckout # timeout=10
> git.exe checkout -f 906512a162e00bf283a2efe897154e8b55b452d3 # timeout=10
ERROR: Checkout failed
hudson.plugins.git.GitException: Command "git.exe checkout -f 906512a162e00bf283a2efe897154e8b55b452d3" returned status code 1:
stdout:
stderr: error: unable to create file integration/testdata/TestMavenIntegration/cloud-sdk-spring-archetype/application/src/main/java/mydemo/controllers/HelloWorldController.java: Filename too long
error: unable to create file integration/testdata/TestMavenIntegration/cloud-sdk-spring-archetype/application/src/main/java/mydemo/models/HelloWorldResponse.java: Filename too long
error: unable to create file integration/testdata/TestMavenIntegration/cloud-sdk-spring-archetype/integration-tests/src/test/java/mydemo/HelloWorldControllerTest.java: Filename too long
error: unable to create file integration/testdata/TestMavenIntegration/cloud-sdk-tomee-archetype/integration-tests/src/test/java/mydemo/HelloWorldServletTest.java: Filename too long
HEAD is now at 906512a1 feat(cnbbuild): allow bindings to have multiple keys (#4231)

at org.jenkinsci.plugins.gitclient.CliGitAPIImpl.launchCommandIn(CliGitAPIImpl.java:2734)
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl$9.execute(CliGitAPIImpl.java:3056)
Caused: hudson.plugins.git.GitException: Could not checkout 906512a162e00bf283a2efe897154e8b55b452d3
at org.jenkinsci.plugins.gitclient.CliGitAPIImpl$9.execute(CliGitAPIImpl.java:3080)
at hudson.plugins.git.GitSCM.checkout(GitSCM.java:1359)
at org.jenkinsci.plugins.workflow.steps.scm.SCMStep.checkout(SCMStep.java:129)
at org.jenkinsci.plugins.workflow.libs.SCMSourceRetriever.lambda$doRetrieve$1(SCMSourceRetriever.java:200)
at org.jenkinsci.plugins.workflow.libs.SCMSourceRetriever.retrySCMOperation(SCMSourceRetriever.java:148)
at org.jenkinsci.plugins.workflow.libs.SCMSourceRetriever.doRetrieve(SCMSourceRetriever.java:199)
at org.jenkinsci.plugins.workflow.libs.SCMSourceRetriever.retrieve(SCMSourceRetriever.java:137)
at org.jenkinsci.plugins.workflow.libs.LibraryAdder.retrieve(LibraryAdder.java:260)
at org.jenkinsci.plugins.workflow.libs.LibraryAdder.add(LibraryAdder.java:150)
at org.jenkinsci.plugins.workflow.libs.LibraryDecorator$1.call(LibraryDecorator.java:125)
at org.codehaus.groovy.control.CompilationUnit.applyToPrimaryClassNodes(CompilationUnit.java:1087)
at org.codehaus.groovy.control.CompilationUnit.doPhaseOperation(CompilationUnit.java:624)
at org.codehaus.groovy.control.CompilationUnit.processPhaseOperations(CompilationUnit.java:602)
at org.codehaus.groovy.control.CompilationUnit.compile(CompilationUnit.java:579)
at groovy.lang.GroovyClassLoader.doParseClass(GroovyClassLoader.java:323)
at groovy.lang.GroovyClassLoader.parseClass(GroovyClassLoader.java:293)
at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.GroovySandbox$Scope.parse(GroovySandbox.java:163)
at org.jenkinsci.plugins.workflow.cps.CpsGroovyShell.doParse(CpsGroovyShell.java:142)
at org.jenkinsci.plugins.workflow.cps.CpsGroovyShell.reparse(CpsGroovyShell.java:127)
at org.jenkinsci.plugins.workflow.cps.CpsFlowExecution.parseScript(CpsFlowExecution.java:563)
at org.jenkinsci.plugins.workflow.cps.CpsFlowExecution.start(CpsFlowExecution.java:515)
at org.jenkinsci.plugins.workflow.job.WorkflowRun.run(WorkflowRun.java:336)
at hudson.model.ResourceController.execute(ResourceController.java:107)
at hudson.model.Executor.run(Executor.java:449)
ERROR: Maximum checkout retry attempts reached, aborting
org.codehaus.groovy.control.MultipleCompilationErrorsException: startup failed:
WorkflowScript: Loading libraries failed

1 error

at org.codehaus.groovy.control.ErrorCollector.failIfErrors(ErrorCollector.java:309)
at org.codehaus.groovy.control.CompilationUnit.applyToPrimaryClassNodes(CompilationUnit.java:1107)
at org.codehaus.groovy.control.CompilationUnit.doPhaseOperation(CompilationUnit.java:624)
at org.codehaus.groovy.control.CompilationUnit.processPhaseOperations(CompilationUnit.java:602)
at org.codehaus.groovy.control.CompilationUnit.compile(CompilationUnit.java:579)
at groovy.lang.GroovyClassLoader.doParseClass(GroovyClassLoader.java:323)
at groovy.lang.GroovyClassLoader.parseClass(GroovyClassLoader.java:293)
at org.jenkinsci.plugins.scriptsecurity.sandbox.groovy.GroovySandbox$Scope.parse(GroovySandbox.java:163)
at org.jenkinsci.plugins.workflow.cps.CpsGroovyShell.doParse(CpsGroovyShell.java:142)
at org.jenkinsci.plugins.workflow.cps.CpsGroovyShell.reparse(CpsGroovyShell.java:127)
at org.jenkinsci.plugins.workflow.cps.CpsFlowExecution.parseScript(CpsFlowExecution.java:563)
at org.jenkinsci.plugins.workflow.cps.CpsFlowExecution.start(CpsFlowExecution.java:515)
at org.jenkinsci.plugins.workflow.job.WorkflowRun.run(WorkflowRun.java:336)
at hudson.model.ResourceController.execute(ResourceController.java:107)
at hudson.model.Executor.run(Executor.java:449)
Finished: FAILURE
mayurbelur_mohan
Product and Topic Expert
It seems like it is fetching the wrong Git repo:
stderr: error: unable to create file integration/testdata/TestMavenIntegration/cloud-sdk-spring-archetype/application/src/main/java/mydemo/controllers/HelloWorldController.java: Filename too long
error: unable to create file integration/testdata/TestMavenIntegration/cloud-sdk-spring-archetype/application/src/main/java/mydemo/models/HelloWorldResponse.java: Filename too long
error: unable to create file integration/testdata/TestMavenIntegration/cloud-sdk-spring-archetype/integration-tests/src/test/java/mydemo/HelloWorldControllerTest.java: Filename too long
error: unable to create file integration/testdata/TestMavenIntegration/cloud-sdk-tomee-archetype/integration-tests/src/test/java/mydemo/HelloWorldServletTest.java: Filename too long
former_member842591
Discoverer
Hi Mayur,

In your TechEd session I couldn't see any steps regarding the shared library. Following your blog, I have mapped the Git repository below in my Jenkins global configuration. Could you tell me which shared libraries need to be added for artifact upload, deploy, and getting the MPL status?
https://github.com/SAP/jenkins-library.git
mayurbelur_mohan
Product and Topic Expert
A shared-library repo sample is already given in the blog above, for example:

integrationArtifactDeploy -- https://github.com/mayurmohan/IntegrationArtifactDeploy
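To make the shared-library setup more concrete: once the library from https://github.com/SAP/jenkins-library.git is registered as a Global Pipeline Library in Jenkins, the piper steps can be called directly from a Jenkinsfile. The sketch below is an illustration only, not taken from the blog; the library name `piper-lib-os`, the credentials ID `cpiCredentialsId`, and the flow ID `MyFlow` are placeholder assumptions you would replace with your own values, and the parameter names should be checked against the current step documentation.

```groovy
// Minimal sketch: assumes a Global Pipeline Library named 'piper-lib-os'
// pointing at https://github.com/SAP/jenkins-library.git is configured
// under Manage Jenkins > System.
@Library('piper-lib-os') _

node {
    stage('Deploy Integration Flow') {
        // 'cpiCredentialsId' and 'MyFlow' are placeholders for your own
        // Cloud Integration service-key credential and integration flow ID.
        integrationArtifactDeploy(
            script: this,
            cpiApiServiceKeyCredentialsId: 'cpiCredentialsId',
            integrationFlowId: 'MyFlow'
        )
    }
    stage('Check MPL Status') {
        integrationArtifactGetMplStatus(
            script: this,
            cpiApiServiceKeyCredentialsId: 'cpiCredentialsId',
            integrationFlowId: 'MyFlow'
        )
    }
}
```

With this in place, no per-step repositories are needed; the single SAP/jenkins-library registration exposes all the Cloud Integration steps listed in the table above.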
former_member37947
Participant
Hi mayurbelur.mohan

Great blog!

I've been trying to get this working in my own VM.

Does this work on a Jenkins running on windows?

I get the following error when deploying:


I've been reading about this error, and it looks like under the hood my Jenkins pipeline may be throwing "sh" commands.

Any clues?

Thanks in advance

former_member37947
Participant
This command may be the fix for the "Filename too long" errors if you are running Jenkins on Windows:
git config --system core.longpaths true
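To expand on that suggestion: on a Windows Jenkins agent the long-path fix can have two parts, the Git setting itself and, on some Windows builds, the OS-level long-path policy. The sketch below is a suggestion based on general Git-on-Windows behavior, not from the blog; the registry step is only needed if the Git setting alone is not enough, and both commands require an elevated (administrator) prompt.

```shell
# Allow Git for Windows to create paths longer than 260 characters
git config --system core.longpaths true

# Optionally, enable long paths at the OS level (Windows 10 1607+);
# run from an elevated cmd/PowerShell prompt.
reg add "HKLM\SYSTEM\CurrentControlSet\Control\FileSystem" /v LongPathsEnabled /t REG_DWORD /d 1 /f
```

After changing these settings, retry the Jenkins build so the shared-library checkout runs again with long paths enabled.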
cmiron
Explorer
Hi mayurbelur.mohan

Great post. Would you mind telling me how you created the apimApiServiceKeyCredentialsId for working with proxies?
Thank you in advance.