I have been seeing a lot of posts on large file processing, and we are running into many issues with large input files ourselves. My input files are only around 100 MB and we are still having problems. Could this be related to the 4-CPU server we are using? We applied the VM settings from the tuning guide a couple of months ago. Is there a newer version of that guide?
Whenever we process a large file, it trips our J2EE engine and the engine restarts on its own. I know 100 MB is not a huge file, but please advise.
Your hardware might not be sufficient to run all the processes. Although you can install XI on the hardware you have, it may not be able to handle the load, so check your hardware configuration. For example, if you put a file through in test mode, this will increase the queue size and may cause some services to restart or shut down.
I've got the same problem.
I want to process a 150 MB file, and during conversion to XML the J2EE engine restarts.
It seems to be a lack of memory.
I tried smaller files of around 50 MB and they are successfully converted to XML.
Afterwards the server needs 4 hours to transfer the file from the AF to the IE. Then the mapping starts and runs for about 7 hours. It is just a simple XSL mapping, so there should be no problem at all. But after those seven hours I get an out-of-memory error and the processing of the message is cancelled.
My J2EE engine has 1024 MB of heap. I can't increase the heap of this server process, but I could try to set up an additional server process to add some memory to the Java engine.
Does anybody know if adding a server process would help in this situation?
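One thing that makes a 150 MB file explode into far more than 150 MB of heap is building a full DOM before the XSL transform runs. Outside of XI, feeding the XSLT processor a StreamSource instead of a pre-built DOM at least avoids holding your own in-memory tree on top of whatever the engine builds internally. A minimal, hedged sketch in plain Java (class and method names are my own; this is standard JAXP, not XI-specific):

```java
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class StreamingTransform {

    // Run an XSL transform from stream to stream. No DOM is built by this
    // code itself; peak memory then depends only on the XSLT engine's
    // internal representation, not on an extra application-side tree.
    public static String transform(String xml, String xslt) throws Exception {
        TransformerFactory factory = TransformerFactory.newInstance();
        Transformer t = factory.newTransformer(
                new StreamSource(new StringReader(xslt)));
        StringWriter out = new StringWriter();
        t.transform(new StreamSource(new StringReader(xml)),
                    new StreamResult(out));
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        // Tiny identity-style stylesheet for demonstration only.
        String xslt =
            "<xsl:stylesheet version=\"1.0\" "
          + "xmlns:xsl=\"http://www.w3.org/1999/XSL/Transform\">"
          + "<xsl:output method=\"xml\" omit-xml-declaration=\"yes\"/>"
          + "<xsl:template match=\"/doc\">"
          + "<out><xsl:value-of select=\"item\"/></out>"
          + "</xsl:template>"
          + "</xsl:stylesheet>";
        System.out.println(transform("<doc><item>42</item></doc>", xslt));
    }
}
```

Whether the XI mapping runtime already streams like this I can't say; but if your mapping is run through a custom Java step, using StreamSource/StreamResult instead of DOMSource/DOMResult is a cheap thing to check before adding server processes.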
I also tried to split the file into smaller XML messages using the Recordsets per Message parameter.
It didn't work: I still got one enormous XML message, and no splitting was done. Has anybody worked out how to get this feature running?
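If the adapter's Recordsets per Message parameter won't cooperate, a fallback is to pre-split the flat file into smaller pieces before it ever reaches the adapter, so each piece becomes its own message. A minimal sketch in plain Java (class and method names are my own invention, and the one-record-per-line assumption may not match your file structure):

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;
import java.util.ArrayList;
import java.util.List;

public class RecordSplitter {

    // Split flat-file content into chunks of at most recordsPerChunk lines,
    // roughly what the Recordsets per Message parameter is supposed to do.
    // Assumes one record per line; adapt the boundary test for multi-line records.
    public static List<String> split(String content, int recordsPerChunk)
            throws IOException {
        List<String> chunks = new ArrayList<String>();
        BufferedReader reader = new BufferedReader(new StringReader(content));
        StringBuilder chunk = new StringBuilder();
        int count = 0;
        String line;
        while ((line = reader.readLine()) != null) {
            chunk.append(line).append('\n');
            if (++count == recordsPerChunk) {
                chunks.add(chunk.toString());
                chunk.setLength(0);
                count = 0;
            }
        }
        // Flush any trailing partial chunk.
        if (chunk.length() > 0) {
            chunks.add(chunk.toString());
        }
        return chunks;
    }
}
```

Each chunk could then be written out as its own file for the adapter to pick up, keeping every individual message small enough for the mapping step.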