on 2023 Feb 21 4:48 PM
Hello Experts,
We have an interface that is triggered by a timer event and pulls data from an API, and the volume is huge. The API returns 10k records per call and gives you a next key (e.g., 10001) to use in the next call, so we have put in a looping process call. During tests we found that the tenant runs into problems while processing over 400k records: messages get stuck in processing status and the runtime node restarts itself. So we put a condition on the looping process call to stop after 40 iterations or when NextTableName = -1 (no more data). The problem with this is that you have to wait for subsequent runs to load the remaining data into the receiver system (SFTP).
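For reference, a simplified sketch of the script step that extracts the next key from the response and drives the loop condition. This assumes the response is JSON and exposes the paging field as NextTableName; the property name is illustrative.

```groovy
import com.sap.gateway.ip.core.customdev.util.Message
import groovy.json.JsonSlurper

def Message processData(Message message) {
    // Parse the API response (assumed JSON) and pull out the paging key.
    def payload = new JsonSlurper().parseText(message.getBody(String))

    // NextTableName is the paging field; the API returns -1 when no data is left.
    def nextKey = payload?.NextTableName ?: -1
    message.setProperty('nextKey', nextKey.toString())
    return message
}
```

The looping process call's condition expression is then something like `${property.nextKey} != '-1'`, with Max Number of Iterations (40 in our case) as the safety cap.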
What we would like is that once this interface runs and finishes, it runs again depending on the nextKey value, and keeps doing that until there are no more records. Is this possible, and what is the best way to achieve it?
Any suggestions are appreciated.
Thanks,
Hemant
You did not say whether you have any mapping requirements for transforming the API responses. If not, rather than collecting all the records in one run, you might append them to an SFTP file and, while there are more records, call the same iFlow via a SOAP adapter in asynchronous mode (the caller does not wait for the provider to finish processing), using HTTP headers to pass parameters. You kick off the process with a timer, which triggers the iFlow; the iFlow then keeps calling itself in separate instances until all the records have been read. At that point you have an SFTP file to grab, transform, and send to the receiver. You might do that with an SFTP lookup in the last run of the iFlow, when you find there are no more records to fetch. You do need to make sure the SFTP file is not too big for CPI to handle, though.
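For passing the paging key between instances, a small script step before the SOAP call could copy it into an HTTP header. A minimal sketch; 'X-Next-Key' is just an illustrative name, and whatever name you choose must also be listed under the iFlow's Allowed Header(s) runtime setting:

```groovy
import com.sap.gateway.ip.core.customdev.util.Message

def Message processData(Message message) {
    // Copy the paging key (set earlier from the API response) into an HTTP header
    // so the next asynchronous call of this same iFlow receives it.
    message.setHeader('X-Next-Key', message.getProperty('nextKey') as String)
    return message
}
```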
Hi Vijay,
Thanks for your response.
There is no mapping, but we are converting JSON to XML and then to CSV, and adding the CSV headers once we are out of the looping process call. If I understood correctly, you are suggesting that we build two iFlows: one would have all this processing but would be called via SOAP/ProcessDirect, and another would be triggered by a timer and would run the first one in a loop until there is no more data. Is that what you are suggesting, or something else?
This is the current iFlow: [screenshot]
Looping process call: [screenshot]
Thanks,
Hemant
You will have two iFlows.
1. Triggered by a timer; it calls the 2nd iFlow asynchronously via the SOAP adapter, passing the initial HTTP headers.
2. The main iFlow, which recursively calls itself asynchronously until all the records are written to a file (see the sketch below). When called for the first time, this iFlow should read the HTTP headers and call the API. It then converts the response JSON to XML, the XML to a flat file, and appends the records to a file via SFTP. If there are more records to fetch, it calls itself asynchronously, passing along the next key/counter, whatever form it takes. The main point is that each iFlow instance terminates as soon as its current run is over; no instance waits for another. You are simply handling, say, 10k records per call and not piling them up in memory. When no more API calls are needed, the iFlow should then read the SFTP file again (if needed) to add the CSV headers, unless you decide to do that in the first iFlow.
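For illustration, the first script step of the main iFlow could look roughly like this. A sketch assuming the key arrives in an 'X-Next-Key' header; on the very first call (from the timer iFlow) the header may carry the initial key or be absent, so it falls back to a starting value. All names are illustrative:

```groovy
import com.sap.gateway.ip.core.customdev.util.Message

def Message processData(Message message) {
    // Read the paging key passed by the caller; default to the initial key
    // when the header is missing (first run).
    def nextKey = (message.getHeaders()['X-Next-Key'] ?: '0') as String
    message.setProperty('nextKey', nextKey)
    return message
}
```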
Hope I did not confuse you even more.