Technology Blogs by Members
MarianVatafu

Scenario Overview


The standard monitoring capabilities of SAP Integration Suite have their downsides. Most of us have, at some point, needed to persist trace logs, or to save the logs of a job that runs outside business hours.

Together with my colleague and dear friend codrinz, we started to think of a way to persist the much-needed trace logs, which are only available for one hour on the SAP Integration Suite tenant.

The idea came from a need we had while testing several artifacts: because they were scheduled around 4 AM, nobody was there when they ran, and in the morning we were left with plain, short error messages.

Using the standard payload logging as attachments has its limitations: the attachments pile up in the backend and slow down the tenant, so we had to think of a different way of accessing the much-needed logs.

By integrating SAP Business Technology Platform products, we managed to create an application that can schedule a trace log for a specific iFlow artifact, scan for any runs in the specified interval and save them into an SAP HANA database. In addition, with the help of OpenAI, we get suggestions on how to fix the captured errors.

All the SAP products are used on a trial environment, so if you want to build anything similar to what we did, make sure to read the SAP Tutorials for Developers.

Architecture Overview


The following picture gives an overview of the solution architecture:

 


Solution Overview


 

The end user can use the HTML page exposed by the SAP CAP application to configure and inspect the saved logs.

 


Short video demo



Technical setup


SAP CAP Application Backend


 


Trial dev space


The application was created in SAP Business Application Studio using Node.js, Express and HDB connections to the SAP HANA database. It consists of multiple services and functions that are called from the HTML page and from SAP Integration Suite.


Below are the entities we exposed in the catalog service, which are used to communicate with the database.


Catalog services


 

The services are bound to the data model created below.


Data model


By also creating some dummy CSVs for sample data, we were able to push the application to Cloud Foundry, generate the HDI container and tables, and deploy it.
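Once deployed, the entities exposed by the catalog service can be queried over OData from the frontend. Below is a minimal sketch of building such a query URL; the service path, entity and field names here are purely illustrative, not the real ones from our project.

```javascript
// Sketch: build an OData query URL for an entity exposed by a CAP catalog
// service. Service root, entity and field names are illustrative.
function buildODataQuery(serviceRoot, entity, filters = {}) {
  // Turn { field: value } pairs into an OData $filter expression
  const clauses = Object.entries(filters)
    .map(([field, value]) => `${field} eq '${value}'`)
    .join(' and ');
  const query = clauses ? `?$filter=${encodeURIComponent(clauses)}` : '';
  return `${serviceRoot}/${entity}${query}`;
}

// Example: query all saved trace logs for one iFlow
const url = buildODataQuery('/catalog', 'TraceLogs', { iFlowName: 'MyIflow' });
// fetch(url).then(r => r.json()).then(console.log);
```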

SAP CAP Application Frontend



Application Home Screen


 

The frontend was designed in HTML combined with JavaScript. It consists of three pages, each with buttons that trigger REST calls to the SAP HANA database or to the SAP Integration Suite endpoint controlling the various functionalities.

 

Below is sample JavaScript code used to populate, filter and update a table.
<input type="text" id="searchInput" placeholder="Search for iFlow name..." onkeyup="filterTable()">
<br><br>

<table id="dataTable">
  <thead>
    <tr>
      <th onclick="sortTable(0)">iFlow name</th>
      <th onclick="sortTable(1)">StartDate</th>
      <th onclick="sortTable(2)">StartTime</th>
      <th onclick="sortTable(3)">EndDate</th>
      <th onclick="sortTable(4)">EndTime</th>
      <th>Action</th> <!-- Column for the delete buttons -->
    </tr>
  </thead>
  <tbody id="dataBody">
    <!-- Table content is added dynamically -->
  </tbody>
</table>

<!-- Sorting options dropdown: one ascending/descending pair per sortable column -->
<div class="sorting-options" id="sortingOptions">
  <a href="#" onclick="sortTable(0, 'asc')">Sort Ascending</a>
  <a href="#" onclick="sortTable(0, 'desc')">Sort Descending</a>
  <a href="#" onclick="sortTable(1, 'asc')">Sort Ascending</a>
  <a href="#" onclick="sortTable(1, 'desc')">Sort Descending</a>
  <a href="#" onclick="sortTable(2, 'asc')">Sort Ascending</a>
  <a href="#" onclick="sortTable(2, 'desc')">Sort Descending</a>
  <a href="#" onclick="sortTable(3, 'asc')">Sort Ascending</a>
  <a href="#" onclick="sortTable(3, 'desc')">Sort Descending</a>
  <a href="#" onclick="sortTable(4, 'asc')">Sort Ascending</a>
  <a href="#" onclick="sortTable(4, 'desc')">Sort Descending</a>
</div>

<script>
// Delete a row when its delete button is clicked
function deleteRow(row) {
  const iFlowName = row.cells[0].textContent.trim(); // The iFlow name is in the first column

  // Ask the backend to delete the corresponding scheduled trace
  deleteIFlow(iFlowName);

  // Remove the row from the table
  row.remove();
}

// Make a GET request to delete the scheduled trace for an iFlow
function deleteIFlow(iFlowName) {
  // ${DispatcherURL} and ${apiKey} are deployment-specific placeholders
  const apiUrl = 'https://${DispatcherURL}';
  const apiKey = '${apiKey}';
  const step = 'DeleteScheduleTraces';

  // Construct the full URL with query parameters
  const url = `${apiUrl}?Step=${step}&Name=${iFlowName}`;

  fetch(url, {
    method: 'GET',
    headers: {
      'ApiKey': apiKey
    }
  })
    .then(response => {
      if (!response.ok) {
        throw new Error('Network response was not ok');
      }
      return response.text();
    })
    .then(data => {
      // Display the backend's confirmation message
      alert(data);
      console.log('iFlow deleted:', data);
    })
    .catch(error => {
      alert(`Error deleting iFlow: ${error}`);
      console.error('Error deleting iFlow:', error);
    });
}

// Filter the table based on the search input
function filterTable() {
  const searchInput = document.getElementById("searchInput");
  const filterText = searchInput.value.trim().toUpperCase();
  const table = document.getElementById("dataTable");
  const rows = table.getElementsByTagName("tr");

  for (let i = 1; i < rows.length; i++) { // Start from index 1 to skip the header row
    const cells = rows[i].getElementsByTagName("td");

    // Only search in the first column (iFlow name)
    const textValue = cells[0].textContent || cells[0].innerText;

    // Partial match: show the row only if it contains the filter text
    rows[i].style.display = textValue.toUpperCase().includes(filterText) ? "" : "none";
  }
}

// Make the GET request and populate the table
function loadTableData() {
  // ${apiUrl} and ${apiKey} are deployment-specific placeholders
  const apiUrl = '${apiUrl}?Step=ScheduledTraces';
  const apiKey = '${apiKey}';

  fetch(apiUrl, {
    method: 'GET',
    headers: {
      'ApiKey': apiKey
    }
  })
    .then(response => {
      if (!response.ok) {
        throw new Error('Network response was not ok');
      }
      return response.text();
    })
    .then(data => {
      // Parse the XML response
      const parser = new DOMParser();
      const xmlDoc = parser.parseFromString(data, 'text/xml');
      const iFlows = xmlDoc.querySelectorAll('iFlow');

      // Populate the table with data
      const tableBody = document.getElementById('dataBody');
      tableBody.innerHTML = ''; // Clear existing rows

      iFlows.forEach(iFlow => {
        const iFlowName = iFlow.querySelector('iFlowName').textContent;
        const startDate = iFlow.querySelector('StartDate').textContent;
        const startTime = iFlow.querySelector('StartTime').textContent;
        const endDate = iFlow.querySelector('EndDate').textContent;
        const endTime = iFlow.querySelector('EndTime').textContent;

        // Create a new table row and cells
        const row = document.createElement('tr');
        const iFlowNameCell = document.createElement('td');
        const startDateCell = document.createElement('td');
        const startTimeCell = document.createElement('td');
        const endDateCell = document.createElement('td');
        const endTimeCell = document.createElement('td');
        const deleteCell = document.createElement('td'); // Cell for the delete button

        // Set cell values
        iFlowNameCell.textContent = iFlowName;
        startDateCell.textContent = startDate;
        startTimeCell.textContent = startTime;
        endDateCell.textContent = endDate;
        endTimeCell.textContent = endTime;

        // Create the delete button
        const deleteButton = document.createElement('button');
        deleteButton.textContent = 'Delete';
        deleteButton.classList.add('delete-button');
        deleteButton.addEventListener('click', () => {
          // Delete the schedule and remove the row when clicked
          deleteRow(row);
        });

        // Append cells to the row
        row.appendChild(iFlowNameCell);
        row.appendChild(startDateCell);
        row.appendChild(startTimeCell);
        row.appendChild(endDateCell);
        row.appendChild(endTimeCell);
        deleteCell.appendChild(deleteButton);
        row.appendChild(deleteCell);

        // Append the row to the table body
        tableBody.appendChild(row);
      });

      // After populating the table, apply the current filter
      filterTable();
    })
    .catch(error => {
      console.error('Error loading table data:', error);
    });
}

// Load the table data when the page loads
window.addEventListener('load', loadTableData);
</script>

<button onclick="exportTableToExcel()">Export to Excel</button>

<script>
// Export the table data to an Excel file.
// Requires the SheetJS (xlsx) library to be loaded on the page.
function exportTableToExcel() {
  const table = document.getElementById("dataTable");
  const ws = XLSX.utils.table_to_sheet(table);

  // Create a new workbook and add the worksheet
  const wb = XLSX.utils.book_new();
  XLSX.utils.book_append_sheet(wb, ws, "Table Data");

  // Save the workbook as an Excel file
  XLSX.writeFile(wb, "table_data.xlsx");
}
</script>

 

There is quite a bit of JavaScript involved in wiring up the buttons and tables. All the REST calls are made through the Express application framework, and the CSS is fairly simple, using public Bootstrap plus custom-made layouts.

SAP HANA Database


We are using a trial SAP HANA database instance on Cloud Foundry; as a prerequisite, we had to create the SAP HANA Schemas & HDI Containers and SAP HANA Cloud services.

The database is a simple one, without a data lake, in which we created a table to store the logs.


SAP HANA Database


 

For local testing, we used a sample CSV to generate our hdbcalculationview and table artifacts.

For deployment, we used the MTA Module Template to generate the MTAR file; then, with the mbt build and cf deploy commands, we deployed it to Cloud Foundry.

OpenAI Chat Completions API


We used the API to get suggestions on the errors captured by the scheduled trace. The suggestions are purely a point of reference, since the generative AI draws on public knowledge about common issues and errors.
{
  "model": "gpt-3.5-turbo",
  "messages": [
    {
      "role": "user",
      "content": "Write advice for a developer on how to fix this SAP CPI error, maximum ${property.lengthOfResponse} characters: ${in.body}"
    }
  ]
}

Sample request body
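The ${property.lengthOfResponse} and ${in.body} tokens in the sample above are CPI expressions that the iFlow substitutes at runtime. As a sketch, the same request body could be assembled in JavaScript like this; the error text and length here are example values, and the actual call would additionally need an OpenAI API key.

```javascript
// Sketch: build the Chat Completions request body the way the iFlow's CPI
// expressions (${property.lengthOfResponse}, ${in.body}) would at runtime.
function buildChatPayload(errorText, lengthOfResponse) {
  return {
    model: 'gpt-3.5-turbo',
    messages: [
      {
        role: 'user',
        content: `Write advice for a developer on how to fix this SAP CPI error, maximum ${lengthOfResponse} characters: ${errorText}`
      }
    ]
  };
}

const payload = buildChatPayload('java.lang.NullPointerException in step X', 500);
// The actual POST (API key required) would look like:
// fetch('https://api.openai.com/v1/chat/completions', {
//   method: 'POST',
//   headers: { 'Authorization': `Bearer ${OPENAI_API_KEY}`, 'Content-Type': 'application/json' },
//   body: JSON.stringify(payload)
// });
```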



SAP API Management and SAP Integration Suite


To access SAP Integration Suite endpoints, we had to run the calls through an API created in SAP API Management, because we had to deal with CORS policies.

 


API Created


 

All the steps required are listed in this tutorial; they involve creating an API and a Product, and generating an API key for it.

All the REST calls coming from the frontend run through SAP API Management and end up in a dispatcher that routes each message to a specific iFlow, based on a query parameter that contains a step name.
const apiUrl = "https://${apiURL}?Step=GetTenantIflowList";
const apiKey = "${APIKEY}";

fetch(apiUrl, {
  method: "GET",
  headers: {
    "ApiKey": apiKey
  },
})

Sample code for a REST call



SAP Integration Suite Calls Dispatcher
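The dispatcher's routing logic is essentially content-based routing on the Step query parameter. A minimal JavaScript sketch of it follows; the step names are the ones used in this post, but the target iFlow endpoints are purely illustrative.

```javascript
// Sketch of the dispatcher's content-based routing: the Step query parameter
// selects which iFlow endpoint the request is forwarded to.
// Step names are from this post; the target paths are illustrative.
const STEP_ROUTES = {
  GetTenantIflowList: '/http/iflow-list',
  ScheduledTraces: '/http/scheduled-traces',
  DeleteScheduleTraces: '/http/delete-schedule'
};

function routeStep(step) {
  const target = STEP_ROUTES[step];
  if (!target) {
    // Unknown steps are rejected rather than forwarded
    throw new Error(`Unknown Step: ${step}`);
  }
  return target;
}
```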


 

Email integration


Besides having the full table with detailed errors available, we added the ability to send an email containing all the necessary details for identifying and solving the issue to a list of recipients. We figured this application could be used by a functional consultant without the technical knowledge to solve an SAP Integration Suite issue, for whom it would be helpful to send the details to a developer who has the proper knowledge to fix it, so we used the public Gmail SMTP to send the mails. Below is a screenshot of one of the emails sent by the application.

 


Email sent by the application
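As a sketch, the email content could be assembled from a captured log record like this; the record field names are illustrative, and the actual sending goes through Gmail SMTP (e.g. via a mail library), which is not shown here.

```javascript
// Sketch: format the notification email from a captured trace-log record.
// Field names are illustrative; sending itself goes through Gmail SMTP
// (e.g. via a library such as nodemailer) and is not shown.
function formatErrorMail(record) {
  const subject = `[SAP Integration Suite] Error in iFlow ${record.iFlowName}`;
  const body = [
    `iFlow:     ${record.iFlowName}`,
    `Run start: ${record.startDate} ${record.startTime}`,
    `Error:     ${record.errorMessage}`,
    '',
    'Suggested fix (generated by OpenAI):',
    record.suggestion
  ].join('\n');
  return { subject, body };
}
```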




Conclusions


All these functionalities would help not only integration developers, but also functional consultants. The tool can be really helpful in UAT phases as well, since you can store the step contents indefinitely and access all the data flowing through the integration artifacts.

Of course, it may have some limitations regarding performance, since we used dummy artifacts that are relatively small and do not deal with huge payloads; in time, it could be tweaked so that you can control what is saved, which steps to watch and which headers/properties to log.

I hope you liked the article, and feel free to reply with any suggestions here or in the SAP Integration Suite Questions section.

 

Thank you!