
JSON String to Table Data Type

rajeshps
Participant
0 Likes
1,551

Hello Team

I need your inputs on converting a JSON string to the table data type. I'm building a graph with Gen 1 operators in SAP Data Intelligence (SDI) on-premise 3.2.

A Node Base operator is used for decoding the message, and the Data Transform operator is used to perform the data operations within the graph.

Error

The source and target operators are incompatible for connection. The Data Transform operator supports port of type table.

Group: group1; Messages: Graph failure: failed to deserialize data coming from port 'output' of operator 'python3operator1': failed to deserialize bytes representing type 'string' into type 'vtype.basetypeTable'. Make sure you are sending data that is compatible with the port type 'table'

Process(es) terminated with error(s). restartOnFailure==false
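The error says the Python3 operator is emitting a plain string on a port typed `table`, so the JSON needs to be parsed into row-structured data inside the operator before it is sent. A minimal sketch of that parsing step, assuming the JSON is an array of flat objects (the `api.send`/port wiring shown in the comment is SAP DI runtime specific and only indicative):

```python
import json

def json_to_rows(json_str):
    """Parse a JSON array of objects into (columns, rows) -- the
    row-oriented shape a table-typed port expects.
    Column order follows the keys of the first record."""
    records = json.loads(json_str)
    columns = list(records[0].keys()) if records else []
    rows = [[rec.get(col) for col in columns] for rec in records]
    return columns, rows

# Inside a SAP DI Python3 operator the callback would look roughly like
# this (api is injected by the runtime; the exact table construction
# for a Gen1 'table' port may differ and must be checked in the docs):
#
# def on_input(data):
#     columns, rows = json_to_rows(data)
#     api.send("output", rows)  # "output" must be typed compatibly
#
# api.set_port_callback("input", on_input)
```

The key point is that the object leaving the operator must already have the tabular structure the downstream Data Transform expects, not a serialized string.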

Thanks and Regards,

Rajesh PS

Accepted Solutions (0)

Answers (1)


werner_daehn
Active Contributor
0 Likes

The first thing that comes to mind is the standard Kafka method of using Kafka Connect for SAP HANA.

https://github.com/SAP/kafka-connect-sap

I assume you know about that, so the next question would be what this approach is lacking for your use case.

werner_daehn
Active Contributor
0 Likes

Kafka Connect and SAP DI are two totally different approaches. I have never used DI in projects.

Kafka Connect does use the Schema Registry. To decode/encode Avro messages from/to Kafka, the Schema Registry is a must, because the payload in Kafka starts with a magic byte and the schema id as a 4-byte integer. With this id the Schema Registry is queried, and only then do you know the schema and can decode the payload. All of that happens fully automatically in every system I have ever seen.
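That framing can be seen by splitting a Confluent-framed message by hand. A minimal sketch (the actual Avro decoding would then require fetching the schema for that id from the registry, which is omitted here):

```python
import struct

def split_confluent_frame(payload: bytes):
    """Split a Confluent-framed Kafka message into its parts:
    1 magic byte (0x00), a 4-byte big-endian schema id,
    then the raw Avro-encoded body."""
    if len(payload) < 5 or payload[0] != 0:
        raise ValueError("not a Confluent-framed message")
    (schema_id,) = struct.unpack(">I", payload[1:5])
    return schema_id, payload[5:]

# Build an example frame: magic byte 0, schema id 42, dummy body.
frame = b"\x00" + struct.pack(">I", 42) + b"avro-bytes"
schema_id, body = split_confluent_frame(frame)
# schema_id is 42; body is the Avro payload that the fetched
# schema would be used to decode.
```

This is why the registry is mandatory for consumers: without resolving the schema id, the bytes after the 5-byte header are not interpretable.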