on 2019 Sep 25 8:36 PM
Hello,
I am having an issue while processing a DB-to-IDoc interface in SAP PO 7.5.
I have to process a large payload of about 300 MB. The mapping logic uses 'Variables', in which I save data from the source at runtime and then reuse those values for other target fields in the mapping.
The issue: when I process less than 50 MB of data, the interface is successful, but when I process 300 MB at a time, the mapping fails on the variable field with the error 'Queue does not have the same number of values'. So I suspect that when I process a very large message, the mapping is not able to accommodate all the values in that variable.
Can anyone help me understand how to change the heap size, or how to allocate more heap for mapping objects?
Are there any specific Java parameters that add more runtime memory for the mapping objects? Or let me know if the issue has some other cause.
We already have a Java heap size of 3 GB, but how do we increase the runtime memory for the ESR mapping object?
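Before changing any parameters, it can help to confirm how much heap the mapping runtime actually sees. A minimal plain-Java sketch (in a real scenario you would run the equivalent inside a UDF and write to the mapping trace; the class name here is just for illustration):

```java
public class HeapCheck {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long mb = 1024L * 1024L;
        // maxMemory() reflects the -Xmx setting of the JVM this code runs in,
        // which for a mapping would be the AS Java server node, not a local JVM.
        System.out.println("max heap:   " + rt.maxMemory() / mb + " MB");
        System.out.println("total heap: " + rt.totalMemory() / mb + " MB");
        System.out.println("free heap:  " + rt.freeMemory() / mb + " MB");
    }
}
```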
Thank you. YQ
Hi Majumder/Talasila,
Thank you for the quick suggestions.
We have already done the SAP PI server sizing and tuned the system to handle large data volumes without out-of-memory errors or message blocking.
The issue is with the ESR mapping:
1. At runtime, the 'variable' (the standard option available in mapping) that I added does not seem to be able to handle/retain such a huge number of records.
2. The above issue may be caused by the FormatByExample function that I used in the 'variable' field mapping logic; it seems FormatByExample cannot handle a large data set of 100,000 records. (There is no issue with the data itself, because the source and target fields mapped with FormatByExample have the same record count.)
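For context, formatByExample essentially re-applies the context pattern of its second queue to the values of its first queue, and the 'same number of values' check is exactly what fails here. Below is a minimal plain-Java sketch of that behaviour (this is not the SAP mapping API; the CTX marker is a made-up stand-in for a context change):

```java
import java.util.ArrayList;
import java.util.List;

public class FormatByExampleSketch {
    // Made-up marker standing in for a context change in a mapping queue.
    static final String CTX = "<CC>";

    // Re-applies the context pattern of 'example' to the plain values in 'values'.
    static List<String> formatByExample(List<String> values, List<String> example) {
        long exampleValueCount = example.stream().filter(v -> !CTX.equals(v)).count();
        if (values.size() != exampleValueCount) {
            // This mismatch is what surfaces as "Queue does not have the same number of values".
            throw new IllegalStateException(
                    "Queues differ: " + values.size() + " values vs " + exampleValueCount);
        }
        List<String> result = new ArrayList<>();
        int i = 0;
        for (String e : example) {
            result.add(CTX.equals(e) ? CTX : values.get(i++));
        }
        return result;
    }

    public static void main(String[] args) {
        List<String> values = List.of("a", "b", "c");
        List<String> example = List.of("x", CTX, "y", "z");
        System.out.println(formatByExample(values, example)); // [a, <CC>, b, c]
    }
}
```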
So, is there any specific Java property to change in NWA to provide more Java memory for the mapping runtime?
Also, I am not able to split the message, because I am grouping the records received from the source system based on conditions, and we cannot predict where matching records will occur in the source data. (For example, the logic groups records of the same sales group, and records for one sales group may appear as the first or as the last record out of 100,000 records.)
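To illustrate why an arbitrary split breaks this logic, here is a plain-Java sketch of the grouping (the Row record and its fields are assumptions for the example, not the actual structures):

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class SalesGroupGrouping {
    // Hypothetical source record: a sales-group key plus a payload line.
    record Row(String salesGroup, String payload) {}

    // Groups rows by sales group. This is only correct if ALL rows are seen
    // in one pass, which is why chunking the 300 MB message at arbitrary
    // boundaries would tear groups apart.
    static Map<String, List<Row>> groupBySalesGroup(List<Row> rows) {
        Map<String, List<Row>> groups = new LinkedHashMap<>();
        for (Row r : rows) {
            groups.computeIfAbsent(r.salesGroup(), k -> new ArrayList<>()).add(r);
        }
        return groups;
    }

    public static void main(String[] args) {
        List<Row> rows = List.of(
                new Row("G1", "first record"),
                new Row("G2", "middle record"),
                new Row("G1", "last record")); // same group, far apart in the data
        System.out.println(groupBySalesGroup(rows));
    }
}
```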
Please suggest.
Best Regards, YQ
Hi Yeshua,
I suggest processing the data in chunks to optimize the message size, instead of processing the huge record set in one go (a minimal chunked-read sketch follows at the end of this reply).
Check the links below for reference:
https://answers.sap.com/questions/8558049/pi-71-jdbc-sender-adapter-huge-load-from-db-select.html
https://answers.sap.com/questions/7561264/sending-data-in-chunks-in-jdbc-adapter-with-rollba.html
or
If you can generate a file from the database, pick it up with the file adapter and process the file in chunks.
https://blogs.sap.com/2010/10/18/pixi-pi-73-processing-of-large-files-teaser/
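For illustration, here is a minimal plain-JDBC sketch of a chunked read done outside PI (connection details, table, and column names are assumptions; in the JDBC sender adapter itself, chunking is configured through the adapter's SELECT/UPDATE statements, as the links above describe):

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

public class ChunkedDbRead {
    public static void main(String[] args) throws Exception {
        // Assumed connection details, for illustration only.
        try (Connection con = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "password")) {
            int chunkSize = 10_000;
            long lastId = 0;
            while (true) {
                // Keyset pagination: fetch the next chunk after the last seen key.
                // FETCH FIRST is ANSI SQL; the exact syntax varies by database.
                try (PreparedStatement ps = con.prepareStatement(
                        "SELECT id, payload FROM source_table WHERE id > ? "
                        + "ORDER BY id FETCH FIRST ? ROWS ONLY")) {
                    ps.setLong(1, lastId);
                    ps.setInt(2, chunkSize);
                    int rows = 0;
                    try (ResultSet rs = ps.executeQuery()) {
                        while (rs.next()) {
                            lastId = rs.getLong("id");
                            // ... hand this row to the next processing step ...
                            rows++;
                        }
                    }
                    if (rows < chunkSize) break; // last chunk processed
                }
            }
        }
    }
}
```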
Regards
Bhargava Krishna
You need to check with your Basis team on scaling and sizing. If the data from the source is 300 MB, PI might not be the best option; another ETL tool could be a better fit.