
SAP’s Integration Suite (CPI) gives you visibility — but only just. If you want to check whether a message was delivered or inspect its Custom Header Properties, you can. But it’s manual. One message at a time. No consolidated view. No report.
Cloud ALM improves things slightly by centralising errors and alerts, but when it comes to reporting on message metadata like Custom Header Properties, you're stuck. Neither CPI’s monitor nor Cloud ALM’s Integration & Exception Monitor gives you a way to extract all this data in bulk.
And that’s a problem, because sometimes your business needs answers that only this message metadata can provide.
For many organisations, this isn’t just a nice-to-have — it’s a compliance or contractual requirement.
This is where the gap becomes critical: the data is there, but not accessible in a way that supports operational oversight, reporting, or auditing.
So, what’s the fix?
You build your own reporting pipeline — extracting message data via CPI’s APIs, enriching it with Custom Header Properties, and loading it into SAP Analytics Cloud (SAC).
To get the data you need out of SAP Cloud Integration (CPI), you’ll be working with two key OData APIs: the MessageProcessingLogs endpoint and, for each message, its CustomHeaderProperties navigation.
Basic authentication lets you test the API quickly; for productive use, however, a more secure method such as OAuth 2.0 is recommended, as demonstrated in the attached script.
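As a sketch of that OAuth 2.0 setup, a client-credentials token fetch might look like the following. The token URL and credential names are placeholders (taken from a typical BTP service key), not values from the attached script:

```python
import requests

def fetch_oauth_token(token_url, client_id, client_secret):
    """Request a bearer token via the OAuth 2.0 client-credentials grant."""
    response = requests.post(
        token_url,
        data={"grant_type": "client_credentials"},
        auth=(client_id, client_secret),  # clientid / clientsecret from your service key
    )
    response.raise_for_status()
    return response.json()["access_token"]

def bearer_headers(token):
    """Build the Authorization header used on subsequent API calls."""
    return {"Authorization": f"Bearer {token}"}
```

You would call `fetch_oauth_token` with the values from your service key and pass `bearer_headers(token)` instead of `auth=HTTPBasicAuth(...)` on each request.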
```python
import requests
from requests.auth import HTTPBasicAuth
import pandas as pd

cpi_host = 'https://your-cpi-host.it-cpi.eu10.hana.ondemand.com'
username = 'your-technical-user'
password = 'your-password'

# Fetch the message processing logs
response = requests.get(
    f"{cpi_host}/api/v1/MessageProcessingLogs?$format=json",
    auth=HTTPBasicAuth(username, password)
)
messages = response.json().get('d', {}).get('results', [])

# Enrich each message with its Custom Header Properties
all_data = []
for msg in messages:
    guid = msg.get('MessageGuid')
    hdr_response = requests.get(
        f"{cpi_host}/api/v1/MessageProcessingLogs('{guid}')/CustomHeaderProperties?$format=json",
        auth=HTTPBasicAuth(username, password)
    )
    custom_headers = hdr_response.json().get('d', {}).get('results', [])
    record = {
        'MessageGuid': guid,
        'IntegrationFlowName': msg.get('IntegrationFlowName'),
        'Timestamp': msg.get('LogStart'),
        'CustomHeaderProperties': {hdr['Name']: hdr['Value'] for hdr in custom_headers}
    }
    all_data.append(record)

# Flatten the nested records and export them to CSV
df = pd.json_normalize(all_data, sep='_')
df.to_csv('cpi_message_headers.csv', index=False)
```
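One practical caveat: MessageProcessingLogs is an OData service, so a bulk extract generally needs `$top`/`$skip` paging rather than a single GET. A hedged sketch, with helper names of my own rather than anything from the attached script:

```python
import requests

def fetch_all_pages(get_page, page_size=500):
    """Collect every page from a paged endpoint; get_page(top, skip) returns one page."""
    results, skip = [], 0
    while True:
        page = get_page(page_size, skip)
        results.extend(page)
        if len(page) < page_size:  # a short page means we reached the end
            break
        skip += page_size
    return results

def cpi_page_getter(cpi_host, auth):
    """Adapt the CPI OData endpoint to the get_page(top, skip) shape above."""
    def get_page(top, skip):
        resp = requests.get(
            f"{cpi_host}/api/v1/MessageProcessingLogs"
            f"?$format=json&$top={top}&$skip={skip}",
            auth=auth,
        )
        resp.raise_for_status()
        return resp.json().get("d", {}).get("results", [])
    return get_page
```

With this in place, `messages = fetch_all_pages(cpi_page_getter(cpi_host, auth))` replaces the single GET in the script above.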
To import the data you extracted from SAP Cloud Integration (CPI) into SAP Analytics Cloud (SAC), you’ll be working with the SAC Data Import API.
There is, however, a simple and quick way to import the data in one-click mode, using the api/v1/import/{modelID} endpoint. This method is demonstrated in the attached sample script.
Before you can load data into SAC, you must ensure the model exists and that you’ve verified its metadata.
```python
import requests

sac_host = 'https://<your-sac-tenant>.sapbusinessobjects.cloud'
model_id = 'your-model-id'
token = 'your-bearer-token'

headers = {'Authorization': f'Bearer {token}'}

# Check that the model exists
response = requests.get(
    f'{sac_host}/api/v1/dataimport/models/{model_id}',
    headers=headers
)
print(response.json())

# Retrieve the model's metadata (columns, dimensions, measures)
meta_response = requests.get(
    f'{sac_host}/api/v1/dataimport/models/{model_id}/metadata',
    headers=headers
)
metadata = meta_response.json()
```
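Once you have the metadata, it is worth checking that the fields you extracted from CPI actually exist in the model before staging any data. A minimal sketch, assuming you have already pulled a list of column names out of the metadata response (the helper is illustrative, not part of the attached script):

```python
def missing_columns(model_columns, record):
    """Return the record fields that the SAC model does not know about."""
    return sorted(set(record) - set(model_columns))
```

If this returns a non-empty list for your first record, the import job would fail or silently drop fields, so it is cheaper to catch the mismatch here.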
```python
import json

# Create an import job for the model ("Append" adds rows; "Full" replaces them)
job_payload = {
    "importType": "Append"  # or "Full"
}
create_job_response = requests.post(
    f'{sac_host}/api/v1/dataimport/models/{model_id}/importType',
    headers={**headers, 'Content-Type': 'application/json'},
    data=json.dumps(job_payload)
)
job_info = create_job_response.json()
job_id = job_info.get('id')
print(f"Created import job with ID: {job_id}")
```
```python
# Stage the extracted records into the import job
data_payload = {
    "data": [
        {
            "MessageGuid": "12345",
            "IntegrationFlowName": "InvoiceDispatch",
            "Timestamp": "2024-12-01T10:15:00Z",
            "BusinessObjectID": "INV-1001"
        },
        {
            "MessageGuid": "12346",
            "IntegrationFlowName": "InvoiceDispatch",
            "Timestamp": "2024-12-01T10:16:00Z",
            "BusinessObjectID": "INV-1002"
        }
    ]
}

load_response = requests.post(
    f'{sac_host}/api/v1/dataimport/jobs/{job_id}',
    headers={**headers, 'Content-Type': 'application/json'},
    data=json.dumps(data_payload)
)
if load_response.status_code == 202:
    print("Data loaded into job successfully.")
else:
    print(f"Error loading data: {load_response.status_code} - {load_response.text}")
```
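A large extract should not go into a single request body. Splitting the rows into fixed-size batches keeps each POST small; this is a sketch under the assumption that your tenant accepts repeated posts to the same job (the batch size here is arbitrary, not an SAC limit):

```python
def batched(rows, size):
    """Yield successive fixed-size batches from a list of rows."""
    for start in range(0, len(rows), size):
        yield rows[start:start + size]
```

Each batch would then be posted as `{"data": batch}` to the same jobs endpoint shown above.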
```python
# Validate the staged data before running the job
validate_response = requests.post(
    f'{sac_host}/api/v1/dataimport/jobs/{job_id}/validate',
    headers=headers
)
if validate_response.status_code == 202:
    print("Validation successful. Ready to run the job.")
else:
    print(f"Validation failed: {validate_response.status_code} - {validate_response.text}")
```
```python
# Run the import job
run_response = requests.post(
    f'{sac_host}/api/v1/dataimport/jobs/{job_id}/run',
    headers=headers
)
if run_response.status_code == 202:
    print("Job started successfully.")
else:
    print(f"Failed to start the job: {run_response.status_code} - {run_response.text}")
```
```python
import time

# Poll the job status until it reaches a terminal state
while True:
    status_response = requests.get(
        f'{sac_host}/api/v1/dataimport/jobs/{job_id}/status',
        headers=headers
    )
    status_info = status_response.json()
    status = status_info.get('status')
    print(f"Current job status: {status}")
    if status in ["Completed", "Failed"]:
        break
    time.sleep(5)
```
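The loop above polls forever if a job hangs. For unattended runs, a variant with a timeout and exponential backoff is safer; the helper names below are mine, not from the attached script:

```python
import time

def poll_until_done(get_status, timeout=300, initial_delay=2, max_delay=30):
    """Poll get_status() with exponential backoff until a terminal state or timeout."""
    delay, waited = initial_delay, 0.0
    while waited <= timeout:
        status = get_status()
        if status in ("Completed", "Failed"):
            return status
        time.sleep(delay)
        waited += delay
        delay = min(delay * 2, max_delay)  # back off, but cap the wait between polls
    raise TimeoutError(f"Import job still running after {timeout} seconds")
```

You would pass a small closure that wraps the status GET from the loop above as `get_status`.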
If you’ve ever felt stuck manually digging through CPI logs just to answer simple business questions, now you know there’s a better way.
By using SAP’s APIs, a little Python, and the SAC Data Import API, you can build a repeatable pipeline to extract, load, and report on Custom Header Properties — at scale.
The steps you followed — extracting with /MessageProcessingLogs, staging data for SAC, validating, running, and monitoring — create a blueprint you can extend for other use cases too.
The data was always there. Now, you control it.
A sample Python script is attached to this blog; note that it follows the one-click approach to import the data into SAC.