on 2022 Jun 29 8:52 PM
Hi,
In my project, I want to use Data Services to transfer data coming from an ODP table extractor (CDC) to an Apache Kafka system. I use Data Services to make some transformations and validations in between.
It also needs to be in real time: when there is a change in the ODP table, the job must start and transfer it to Kafka. Is that possible? I am not really familiar with real time or the message system required to start a Data Services real-time job.
For Kafka, I'm not sure how to make it the target. Do I need an adapter, or is there an easier option?
I use the latest version, DS 14.2.
This is the DS process; I put the ODP objects in the enrich DF. When I test manually with SoapUI, the job gets executed. Is there a way to automate this?
This is the result I want to send to Kafka.
Thank you
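A Data Services real-time job published as a web service on the Access Server can be invoked with a plain HTTP POST, so the manual SoapUI test can be scripted. The sketch below is a minimal, hypothetical example: the endpoint URL, service name, and payload fields are all placeholders you would replace with the values from your own Management Console configuration.

```python
# Hypothetical sketch of automating the SoapUI call with a plain
# HTTP POST. The URL, service name, and payload are placeholders.
import urllib.request


def build_soap_envelope(service: str, payload_xml: str) -> str:
    """Wrap a request payload in a minimal SOAP 1.1 envelope."""
    return (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<soapenv:Envelope xmlns:soapenv='
        '"http://schemas.xmlsoap.org/soap/envelope/">'
        '<soapenv:Body>'
        f'<{service}>{payload_xml}</{service}>'
        '</soapenv:Body>'
        '</soapenv:Envelope>'
    )


def call_realtime_service(url: str, service: str, payload_xml: str) -> bytes:
    """POST the envelope to the real-time service endpoint and
    return the raw SOAP response."""
    body = build_soap_envelope(service, payload_xml).encode("utf-8")
    req = urllib.request.Request(
        url,
        data=body,
        headers={
            "Content-Type": "text/xml; charset=utf-8",
            "SOAPAction": service,
        },
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()


if __name__ == "__main__":
    # Placeholder endpoint and service name - substitute your own.
    response = call_realtime_service(
        "http://dsserver:8080/DataServices/servlet/webservices",
        "Enrich_RT_Service",
        "<input>trigger</input>",
    )
    print(response)
```

A script like this could then be run on a schedule (cron, Task Scheduler) or fired by whatever detects the change, instead of clicking through SoapUI each time.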
Hello,
If your ODP extractor changes which columns are returned, the job cannot accommodate those columns on the fly; the job would have to be updated manually.
Is Apache Kafka Hadoop?
Please check the documentation on connecting to Hadoop.
Thanks,
Denise
SAP Support
Hello,
If the changes in the ODP extractor only concern the field values, will it be able to give me the changes on the fly?
Right now I feel that I need to execute the job manually to return the changes. When I start my real-time job, it only catches the changes on the first request, until I end and restart the job manually. The ODP extractor is in change data capture (CDC) mode, so is it possible to make every request return the latest changes in the table values?
I'm not sure the connection is the same as for Hadoop, but I can give it a try.
Thanks
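For what it's worth, Kafka is not Hadoop: it is a distributed message log, so the HDFS-style connection described in the Hadoop documentation will not target it directly. One common workaround, when a native adapter is not available, is a small bridge script between the DS output and Kafka. The sketch below assumes the third-party kafka-python client; the topic name, broker address, and record fields are all placeholders.

```python
# Hypothetical bridge between Data Services output and Kafka,
# using the third-party kafka-python client. Topic, broker, and
# record fields are placeholders.
import json


def to_kafka_record(row: dict) -> bytes:
    """Serialize one validated/enriched row as a JSON message value."""
    return json.dumps(row, sort_keys=True).encode("utf-8")


def send_rows(rows, topic="odp_changes", bootstrap="localhost:9092"):
    """Publish each row to a Kafka topic.

    Requires the third-party kafka-python package
    (pip install kafka-python) - an assumption, not part of DS.
    """
    from kafka import KafkaProducer

    producer = KafkaProducer(bootstrap_servers=bootstrap)
    for row in rows:
        producer.send(topic, value=to_kafka_record(row))
    producer.flush()
```

Serializing the rows separately from the producer call keeps the message format testable without a running broker, and the same `to_kafka_record` shape works for any CDC payload you pass through the enrich dataflow.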