Technology Blogs by SAP

Introduction


SAP has recently released the Kafka adapter in both the Cloud Foundry and Neo environments. This blog describes, step by step, how to build your first simple demo using Cloud Integration as both producer and consumer for Kafka, running the scenario end-to-end.


Some important concepts of Kafka and the Cloud Integration Kafka adapter configuration are described in the blog Cloud Integration – What You Need to Know About the Kafka Adapter.


The Cloud Integration Kafka adapter currently has the limitation that Cloud Connector support is not available. Therefore, the Kafka instance used in this article is a Confluent Cloud trial.


 

Getting a Confluent Cloud Trial


1. Visit https://confluent.cloud/. Register and get the free trial (Basic version).


You can sign up for Confluent Cloud for free. Confluent Cloud provides a simple, scalable, resilient, and secure event streaming platform, which makes it easy to test the Kafka scenario.

2. Create a cluster


A Basic cluster is sufficient for our demo. A USD $200 credit is applied to each of your first three monthly bills, so you won't be charged for the test.



(Screenshot: creating a cluster in Confluent Cloud)



3. Create Topic


The next essential step is to create a topic. It can be created with the default configuration, or you can customize the configuration according to your requirements, as sketched below.
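For reference, a topic can also be created programmatically with the Kafka AdminClient. Below is a minimal Groovy sketch; it assumes the kafka-clients library is on the classpath, and the bootstrap server, API key/secret (created in the next step), and topic name are placeholders to replace with your own values.

import org.apache.kafka.clients.admin.AdminClient
import org.apache.kafka.clients.admin.NewTopic

def props = new Properties()
// Placeholder connection values; copy the real host from the Confluent cluster settings
props.put("bootstrap.servers", "<bootstrap-server>:9092")
props.put("security.protocol", "SASL_SSL")
props.put("sasl.mechanism", "PLAIN")
props.put("sasl.jaas.config",
        'org.apache.kafka.common.security.plain.PlainLoginModule required username="<API_KEY>" password="<API_SECRET>";')

def admin = AdminClient.create(props)
try {
    // Topic name and partition count are examples; Confluent Cloud uses replication factor 3
    admin.createTopics([new NewTopic("test-topic", 6, 3 as short)]).all().get()
} finally {
    admin.close()
}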




4. Create Kafka API Key


Then you need to create an API key for Cloud Integration access.



Copy the key and secret here. We will need them when creating the Kafka credential in Cloud Integration.

 

Design the iFlow in Cloud Integration as Producer for Kafka


1. Set up Kafka Receiver Adapter




a. Copy the host from the Confluent cluster settings




b. Authentication


SASL with the PLAIN mechanism is the easiest to set up in this scenario. We need to save the Confluent API credential in Cloud Integration for access.

To create the credential, we use the API key and secret from the step above as user and password.
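For reference, the same credential mapping in a plain Kafka client configuration looks as follows: the API key becomes the SASL username and the secret becomes the password (placeholder values throughout).

def props = new Properties()
props.put("bootstrap.servers", "<bootstrap-server>:9092")  // host from the Confluent cluster settings
props.put("security.protocol", "SASL_SSL")                 // Confluent Cloud requires TLS
props.put("sasl.mechanism", "PLAIN")
// API key = username, API secret = password
props.put("sasl.jaas.config",
        'org.apache.kafka.common.security.plain.PlainLoginModule required username="<API_KEY>" password="<API_SECRET>";')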


 

2. Set up HTTP Sender Adapter to trigger the process



Configure the iFlow to be a producer for the topic you created in Confluent. Configure the other parameters as required.
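Under the hood, the receiver adapter plays the role of a standard Kafka producer. For comparison, here is a minimal standalone Groovy sketch of that role, assuming the kafka-clients library and the placeholder connection values from above.

import org.apache.kafka.clients.producer.KafkaProducer
import org.apache.kafka.clients.producer.ProducerRecord
import org.apache.kafka.common.serialization.StringSerializer

def props = new Properties()
props.put("bootstrap.servers", "<bootstrap-server>:9092")
props.put("security.protocol", "SASL_SSL")
props.put("sasl.mechanism", "PLAIN")
props.put("sasl.jaas.config",
        'org.apache.kafka.common.security.plain.PlainLoginModule required username="<API_KEY>" password="<API_SECRET>";')
props.put("key.serializer", StringSerializer.name)
props.put("value.serializer", StringSerializer.name)

def producer = new KafkaProducer<String, String>(props)
try {
    // Send one test message to the topic created earlier and wait for the acknowledgement
    producer.send(new ProducerRecord("test-topic", "key-1", "Hello from a test producer")).get()
} finally {
    producer.close()
}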



3. Deploy the iFlow


 

Design the iFlow in Cloud Integration as Consumer (group) for Kafka


1. Set up Kafka Sender Adapter



The host and authentication settings are the same as in the receiver adapter.


Set the adapter to be a consumer of the topic you created. Configure the other parameters as required.
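For comparison, the sender adapter corresponds to a standard Kafka consumer in a consumer group. A minimal Groovy sketch, again with placeholder connection values and a hypothetical group id:

import org.apache.kafka.clients.consumer.KafkaConsumer
import org.apache.kafka.common.serialization.StringDeserializer
import java.time.Duration

def props = new Properties()
props.put("bootstrap.servers", "<bootstrap-server>:9092")
props.put("security.protocol", "SASL_SSL")
props.put("sasl.mechanism", "PLAIN")
props.put("sasl.jaas.config",
        'org.apache.kafka.common.security.plain.PlainLoginModule required username="<API_KEY>" password="<API_SECRET>";')
props.put("group.id", "cpi-demo-consumer-group")  // hypothetical consumer group name
props.put("auto.offset.reset", "earliest")
props.put("key.deserializer", StringDeserializer.name)
props.put("value.deserializer", StringDeserializer.name)

def consumer = new KafkaConsumer<String, String>(props)
try {
    consumer.subscribe(["test-topic"])
    // One short poll for demo purposes; a real consumer would poll in a loop
    consumer.poll(Duration.ofSeconds(5)).each {
        println "offset=${it.offset()} key=${it.key()} value=${it.value()}"
    }
} finally {
    consumer.close()
}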

2. Set up a Groovy Script step to log the message.


import com.sap.gateway.ip.core.customdev.util.Message

def Message processData(Message message) {
    // Read the consumed Kafka payload as a string
    def body = message.getBody(java.lang.String) as String
    def messageLog = messageLogFactory.getMessageLog(message)
    if (messageLog != null) {
        messageLog.setStringProperty("Logging#1", "Printing Payload As Attachment")
        // Attach the payload to the message processing log for inspection
        messageLog.addAttachmentAsString("ResponsePayload:", body, "text/plain")
    }
    return message
}

 

3. You can also customize the message and send it to any downstream processes, as sketched below.
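For example, here is a minimal variation of the script above that wraps the consumed payload in a JSON envelope before passing it on; the header name and JSON structure are purely illustrative.

import com.sap.gateway.ip.core.customdev.util.Message
import groovy.json.JsonOutput

def Message processData(Message message) {
    def body = message.getBody(java.lang.String) as String
    // Hypothetical transformation: wrap the Kafka payload in a small JSON envelope
    message.setBody(JsonOutput.toJson([payload: body, receivedAt: new Date().toString()]))
    message.setHeader("X-Source", "kafka-consumer-iflow")  // illustrative header for downstream steps
    return message
}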


4. Deploy the iFlow


 

Test the iFlow


1. Send an HTTP request from Postman (or any HTTP client; see the sketch below)
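For example, a small Groovy sketch of the request Postman would send; the endpoint URL and basic-authentication credentials are placeholders that depend on your tenant and security setup.

def conn = new URL("https://<your-cpi-runtime-host>/http/<your-endpoint>").openConnection()
conn.setRequestMethod("POST")
// Placeholder credentials for basic authentication
def auth = "<user>:<password>".bytes.encodeBase64().toString()
conn.setRequestProperty("Authorization", "Basic ${auth}")
conn.setRequestProperty("Content-Type", "text/plain")
conn.doOutput = true
conn.outputStream.withWriter { it << "Hello Kafka from Cloud Integration" }
println "Response code: ${conn.responseCode}"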



2. See the message in Confluent




3. Check the iFlow in Cloud Integration



With the above steps you will be able to set up a very simple end-to-end scenario using Kafka on Confluent Cloud and the Kafka adapter in Cloud Integration. There are many settings available in the adapter that you can further try and test; this blog is just to help you get started.

Feedback on the content is welcome in the comments section.

Thanks!