
Smart Data Streaming - File/Hadoop JSON - Array Issue

Former Member

My question concerns the File/Hadoop JSON Input adapter, in particular the possibility of importing information stored in arrays into HANA. I have built a very simple smart data streaming project consisting of an input adapter (i.e. File/Hadoop JSON Input) followed by an input stream. As input I am using the JSON file given in the SAP HANA Smart Data Streaming: Adapters Guide on page 168. Here is what it looks like:

{
	"firstName": "John",
	"lastName": "Smith",
	"phoneNumbers": [
		{ "type": "home", "number": "212 555-1234" },
		{ "type": "fax", "number": "646 555-4567" }
	],
	"friends": [
		["female1", "female2", "female3"],
		["male1", "male2", "male3"]
	]
}

What I am trying to achieve is to get all of the information stored in one of the friends arrays. Following the instructions given in the Adapters Guide, I set jsonRootPath to friends[1] and jsonColsMappingList to * in the properties of the File/Hadoop JSON Input adapter. Yet I do not receive any data in my input stream. Did I misunderstand or misconfigure something? I would appreciate any kind of hint or solution.
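For reference, my project's CCL looks roughly like this (the stream and column names are just what I chose, and the directory is a placeholder for my actual path; jsonRootPath and jsonColsMappingList are set as described in the guide):

CREATE INPUT STREAM FriendsStream SCHEMA (
	friendName string ) ;

ATTACH INPUT ADAPTER File_Hadoop_JSON_Input1 TYPE toolkit_file_json_input
TO FriendsStream
PROPERTIES
	jsonRootPath = 'friends[1]' ,
	jsonColsMappingList = '*' ,
	dir = '/some_dir/streaming_data_files/' ,
	file = 'person.json' ;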

Thanks

RobertWaywell
Product and Topic Expert

What error do you see in the log files? For example, is the file being found?

Former Member

Hi, I have the same problem. Can anyone provide an example of how to use this adapter?

Thanks

RobertWaywell
Product and Topic Expert

Same question for you as for the original poster: what error(s) do you see in the log file? For example, is the file with the JSON content being found?

Testing this today using the following CCL:

CREATE INPUT WINDOW InputWindow1 SCHEMA (
	firstname string ) PRIMARY KEY ( firstname ) ;


ATTACH INPUT ADAPTER File_Hadoop_JSON_Input1 TYPE toolkit_file_json_input
TO InputWindow1
PROPERTIES
	jsonColsMappingList = 'firstname' ,
	dir = '/file_hadoop_json_input_test/streaming_data_files/' ,
	file = 'person.json' ,
	dynamicMode = 'dynamicFile' ,
	removeAfterProcess = FALSE ,
	pollingPeriod = 10 ;

and with the "person.json" file placed in the "/hana/data_streaming/PM1/adapters/default/file_hadoop_json_input_test/streaming_data_files" directory

(Note: "/hana/data_streaming/<SID>/adapters/<workspace>" is the sandbox directory for SDS)

Then I see this error in the project .out file:

06-01-2017 15:36:17.928 INFO [main] (FileInputTransporter.init) /hana/data_streaming/PM1/pm1/adapters/default/file_hadoop_json_input_test/streaming_data_files
06-01-2017 15:36:17.937 ERROR [main] (TransporterWrapper.init) Exception is thrown
java.lang.Exception: Error code:401001, Severity : 3 (Error)
Error message:File person.json doesnt exist.
Error description:File person.json doesnt exist.
	at com.sybase.esp.adapter.transporters.file.FileInputTransporter.init(FileInputTransporter.java:206)
	at com.sybase.esp.adapter.framework.wrappers.TransporterWrapper.init(TransporterWrapper.java:61)
	at com.sybase.esp.adapter.framework.internal.Adapter.init(Adapter.java:216)
	at com.sybase.esp.adapter.framework.internal.AdapterController.executeStart(AdapterController.java:257)
	at com.sybase.esp.adapter.framework.internal.AdapterController.execute(AdapterController.java:156)
	at com.sybase.esp.adapter.framework.Framework.main(Framework.java:62)
06-01-2017 15:36:17.945 INFO [main] (ContextHandler.doStop) Stopped o.e.j.s.ServletContextHandler@6ee477fc{/,null,UNAVAILABLE}
06-01-2017 15:36:17.949 INFO [main] (AbstractConnector.doStop) Stopped ServerConnector@5756920f{HTTP/1.1}{10.173.72.77:19082}

In this case, the server appears to be inserting an extra "pm1" directory into the file path.

I'll look into the error that I'm seeing, but it would be good to know if you are seeing a similar issue.
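In the meantime, one thing that may be worth trying (I haven't verified it, and I'm assuming the adapter accepts an absolute path for the dir property) is to sidestep the sandbox resolution by pointing dir directly at the full path:

	dir = '/hana/data_streaming/PM1/adapters/default/file_hadoop_json_input_test/streaming_data_files/' ,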

Thanks

Pai_N
Explorer
RobertWaywell, I have a similar requirement, but it concerns the JSON Input Kafka adapter. I want to know whether it is possible to import a JSON array through the Kafka adapter. Here is a sample of the JSON I want to import:

"items": [
	{ "item-id": "10", "item-start-date": "2024-08-07T07:00:00", "item-end-date": "2025-08-07T06:59:59", "sales_ord_num": "1234545457", "quantity": 12 },
	{ "item-id": "20", "item-start-date": "2024-08-07T07:00:00", "item-end-date": "2025-08-07T06:59:59", "sales_ord_num": "1234545458", "quantity": 12 },
	{ "item-id": "30", "item-start-date": "2024-08-07T07:00:00", "item-end-date": "2025-08-07T06:59:59", "sales_ord_num": "1234545459", "quantity": 12 },
	{ "item-id": "40", "item-start-date": "2024-08-07T07:00:00", "item-end-date": "2025-08-07T06:59:59", "sales_ord_num": "1234545455", "quantity": 12 },
	{ "item-id": "50", "item-start-date": "2024-08-07T07:00:00", "item-end-date": "2025-08-07T06:59:59", "sales_ord_num": "1234545454", "quantity": 12 }
]

How can I import all 5 lines from the items[] array into an input stream? Can anyone please suggest a way?
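For what it's worth, this is the kind of attachment I was hoping would work, assuming the Kafka JSON input adapter follows the same jsonRootPath/jsonColsMappingList conventions as the file adapter. The adapter type name, topic, and broker properties below are placeholders that I have not confirmed against the Adapters Guide:

CREATE INPUT STREAM ItemsStream SCHEMA (
	item_id string ,
	item_start_date string ,
	item_end_date string ,
	sales_ord_num string ,
	quantity integer ) ;

ATTACH INPUT ADAPTER Kafka_JSON_Input1 TYPE toolkit_kafka_json_input // placeholder type name
TO ItemsStream
PROPERTIES
	jsonRootPath = 'items' , // assuming this property exists for the Kafka adapter
	jsonColsMappingList = 'item-id,item-start-date,item-end-date,sales_ord_num,quantity' ,
	kafkaTopicPartition = 'items_topic' , // placeholder
	kafkaBootstrapServers = 'myhost:9092' ; // placeholder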
RobertWaywell
Product and Topic Expert
@Pai_N - You should post your question as a new thread, both because you are using a different adapter than the one discussed in this thread and because you are asking how to process the JSON payload rather than having an issue with the adapter retrieving the JSON file. Please also note that HANA streaming analytics will reach end of mainstream maintenance in 2025 and will not be extended; SAP does not recommend initiating any new projects using SAP HANA streaming analytics. https://me.sap.com/notes/3440255
Pai_N
Explorer
Thank you for clarifying the previous points. I have a question regarding importing a JSON array into an input stream, and I've started a new thread here: Smart Data Streaming Kafka Adapter - Importing JSON Array. I would appreciate it if you could take a look. Additionally, thank you for informing me about the upcoming end of maintenance for SAP HANA streaming analytics. Could you please suggest alternative tools for streaming analytics?