
In Data Services, how to make a real-time job to transfer data from an ODP extractor to Apache Kafka?


Hi,

In my project, I want to use Data Services to transfer data coming from an ODP table extractor (CDC) to an Apache Kafka system. I use Data Services to do some transformation and validation in between.

It also needs to be in real time: when there is a change in the ODP table, the job must start and transfer it to Kafka. Is this possible? I am not really familiar with real-time jobs or the message system required to start a Data Services real-time job.

For Kafka, I'm not sure how to make it the target. Do I need an adapter, or is there an easier option?
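One workaround, if no native Kafka target is available in your DS version, is to land the transformed output as JSON and push it to Kafka with a small producer script. This is only a sketch of the message-building step: the column names (`CUSTOMER_ID`, `ODQ_CHANGEMODE`), topic name, and broker address are assumptions, and the actual send (which needs the third-party kafka-python package and a running broker) is shown only in a comment so the sketch stays self-contained:

```python
import json

def build_kafka_message(record, key_field="CUSTOMER_ID"):
    """Serialize one ODP change record into a Kafka key/value pair.
    key_field is an assumed primary-key column; the value is the full
    record as JSON, so downstream consumers see every field."""
    key = str(record[key_field]).encode("utf-8")
    value = json.dumps(record, sort_keys=True).encode("utf-8")
    return key, value

# With kafka-python installed, the actual send would look like:
#   from kafka import KafkaProducer
#   producer = KafkaProducer(bootstrap_servers="broker:9092")
#   key, value = build_kafka_message(row)
#   producer.send("odp_changes", key=key, value=value)

if __name__ == "__main__":
    row = {"CUSTOMER_ID": 42, "NAME": "ACME", "ODQ_CHANGEMODE": "U"}
    key, value = build_kafka_message(row)
    print(key.decode(), value.decode())
```

Keying the message by the table's primary key keeps all changes for one row in the same Kafka partition, which preserves their order for consumers.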

I use the latest version, DS 14.2.

This is the DS process; I put the ODP objects in the Enrich dataflow. When I test manually with SOAP UI, the job gets executed. Is there a way to automate this?
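Whatever SOAP UI does manually can be automated: any scheduler or script can POST the same SOAP envelope to the Access Server's real-time service endpoint. A minimal stdlib-only sketch follows; the endpoint URL and the envelope body are placeholders you would copy from your working SOAP UI request, not real values:

```python
import urllib.request

# Placeholder values -- copy the real ones from your working SOAP UI project.
SERVICE_URL = "http://access-server-host:4000/DataServices/servlet/webservices"
SOAP_ENVELOPE = """<?xml version="1.0" encoding="UTF-8"?>
<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
  <soapenv:Body>
    <!-- real-time service input message goes here -->
  </soapenv:Body>
</soapenv:Envelope>"""

def trigger_realtime_service(url=SERVICE_URL, envelope=SOAP_ENVELOPE):
    """Build the HTTP POST that triggers the DS real-time service.
    Returns the prepared request; pass it to urllib.request.urlopen()
    to actually send it."""
    return urllib.request.Request(
        url,
        data=envelope.encode("utf-8"),
        headers={"Content-Type": "text/xml; charset=utf-8"},
        method="POST",
    )
```

Scheduling this script (cron, Windows Task Scheduler, or a loop with `time.sleep`) gives you the periodic request that pulls each new CDC delta without manual intervention.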

This is the result I want to send to Kafka.

Thank you

Accepted Solutions (0)

Answers (1)


denise_meyer
Advisor

Hello,

If your ODP extractor changes which columns are returned, the job cannot accommodate those columns on the fly; the job would have to be updated manually.

Is Apache Kafka Hadoop?

Please check the documentation on connecting to Hadoop:

https://help.sap.com/docs/SAP_DATA_SERVICES/2dce4e02db1e46488049d74cfdd04da5/576493ce6d6d1014b3fc928...

Thanks,
Denise
SAP Support


Hello,

If the changes in the ODP extractor only concern the field values, will it be able to give me the changes on the fly?

Right now I feel that I need to execute the job manually to get the changes. When I start my real-time job, it only catches the changes on the first request, until I end and restart the job manually. The ODP extractor is in change data capture (CDC) mode, so is it possible to make every request return the latest changes in the table values?

I'm not sure whether the connection is the same as for Hadoop, but I can give it a try.

Thanks