2005 Aug 29 9:20 AM
I'm using GUI_UPLOAD to upload files to the application server. When I try to transfer files bigger than 150 MB I get the exception DP_OUT_OF_MEMORY.
2005 Aug 29 9:24 AM
Hi,
You cannot run the FM GUI_UPLOAD in background; it's a GUI function.
Svetlin.
2005 Aug 29 9:27 AM
Use function module C13Z_FRONT_END_TO_APPL to upload data from the presentation server to the application server.
2005 Aug 29 9:49 AM
I cannot find such a function module on my system. Maybe I should mention that I'm using Web AS 6.20.
2005 Aug 29 9:54 AM
Try to use:
OPEN DATASET <src_path> FOR INPUT.
READ DATASET <src_path> INTO <wa>. "in a DO loop; APPEND <wa> TO <itab>
CLOSE DATASET <src_path>.
Then loop over <itab> and
TRANSFER <wa> TO <target_path>.
regds
gv
2005 Aug 29 10:06 AM
It doesn't work. OPEN DATASET can't find the file.
OPEN DATASET file FOR INPUT IN TEXT MODE ENCODING DEFAULT.
IF sy-subrc = 0.
  DO.
    READ DATASET file INTO wa_s.
    IF sy-subrc <> 0. EXIT. ENDIF.
  ENDDO.
  CLOSE DATASET file.
ELSE.
  MESSAGE ID '...' TYPE 'I' NUMBER 001 WITH 'Doesn''t work'. "message class garbled in the original post
ENDIF.
2005 Aug 29 10:32 AM
That's true, you can't run WS_UPLOAD in background; there is even an exception for it:
no_batch = 5
2005 Aug 29 10:34 AM
Thanks.
Pat,
did you find any solution?
Regards
Message was edited by: Surpreet Singh Bal
2005 Aug 29 10:38 AM
If the file size is big, ask your Basis people to store the file on the application server; they can access it directly.
2005 Aug 29 10:39 AM
Hi,
Copy the file to the application server and then use the DATASET statements.
Svetlin
2005 Aug 29 10:43 AM
Yes, I know. That's how I do it for now, but it's what I'm trying to avoid.
2005 Aug 29 12:24 PM
Thanks for the link, but the solution is not applicable: the directory of the file isn't a network share.
2005 Aug 29 12:41 PM
Hi Patrick,
then there are two last options you can try:
- ask your Basis team to increase the memory limit (it can easily go up to 1 GB, if you have the hardware)
- split the file
But using GUI_UPLOAD requires a dialog process -> you will always need at least as much memory (and some more) as your file is large.
Only in batch is line-by-line execution possible.
Regards,
Christian
2005 Aug 29 1:22 PM
Hi Christian, thanks for your response.
The memory is large enough; otherwise the exception SYSTEM_NO_ROLL would be raised. I get DP_OUT_OF_MEMORY.
Splitting the file is not an option for me, unless there is a function module or a class method to do this.
2005 Aug 29 2:41 PM
Hi,
I think that you should change these system parameters (transaction RZ10):
ztta/roll_extension
ztta/max_memreq_MB
Look for relevant OSS notes (sample notes 563688, 425207).
Svetlin
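For illustration, the corresponding instance profile entries might look like this (the values are examples only, not recommendations; the right sizing depends on your hardware and OSS guidance):

```
# example values only - check the OSS notes before changing anything
ztta/roll_extension = 2000683008
ztta/max_memreq_MB = 256
```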
2005 Aug 29 3:05 PM
Hi Patrick,
I always get a 'TSV_TNEW_PAGE_ALLOC_FAILED' dump. The current memory usage is listed inside it, along with a hint about which parameters have to be changed.
I think you are in a similar situation. Have a look at whether the system parameter abap/heap_area_dia is your limit. There is also a parameter abap/heaplimit, which is the limit for one work process. Probably this triggers your dump (but all of this is based on my experience with unsuccessful new-page allocations).
Maybe your 'no roll' dump is more linked to the parameter ztta/roll_area - ask a Basis expert / have a look in OSS.
Regards,
Christian
2005 Aug 29 3:58 PM
Hi Patrick,
Do you want the data to be written to the application server, or to read data from the application server? If your requirement is the first one, then open the file in OUTPUT mode. This mode opens the file if it already exists and creates it if not. Then you transfer the data from your internal table to this file using the TRANSFER statement.
The code you have written reads data from the application server; it does not write data into it.
Please let me know your requirement.
Thanks
Vamsi
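A minimal sketch of the write direction Vamsi describes (the file path and the internal table are invented for illustration):

```abap
DATA: lt_lines TYPE TABLE OF string,
      lv_line  TYPE string,
      lv_file  TYPE string VALUE '/tmp/target.txt'. "hypothetical path

* Open (or create) the target file on the application server
OPEN DATASET lv_file FOR OUTPUT IN TEXT MODE ENCODING DEFAULT.
IF sy-subrc <> 0.
  EXIT. "could not open the file
ENDIF.

* Write the internal table line by line
LOOP AT lt_lines INTO lv_line.
  TRANSFER lv_line TO lv_file.
ENDLOOP.

CLOSE DATASET lv_file.
```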
2005 Aug 29 4:10 PM
hi vamsi,
I want to read data from the presentation server (with GUI_UPLOAD, WS_UPLOAD, ..., but these function modules can't handle large files) and write it to the application server.
Message was edited by: Patrick Plattner
2005 Aug 29 4:53 PM
Thanks to Svetlin, Wolfgang and Christian.
Our Basis team is checking your inputs now.
2005 Aug 29 9:50 AM
Try to use the DATASET statements. But in most cases, they work only on the application server.
Svetlin
2005 Aug 29 2:03 PM
Hi, I think you can split the big file into pieces and upload them to the application server one by one. Then use
OPEN DATASET fname FOR APPENDING.
to concatenate the pieces and recover the original file.
Hope it will be helpful.
Thanks
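A rough sketch of that idea, assuming the file was pre-split on the PC into numbered parts (the part count and all file names are invented):

```abap
DATA: lt_chunk  TYPE TABLE OF string,
      lv_line   TYPE string,
      lv_src    TYPE string,
      lv_num(2) TYPE n,
      lv_dest   TYPE string VALUE '/tmp/bigfile.txt'. "hypothetical target

* Append every uploaded piece to one file on the application server
OPEN DATASET lv_dest FOR APPENDING IN TEXT MODE ENCODING DEFAULT.
DO 10 TIMES. "assuming 10 parts: part01.txt .. part10.txt
  lv_num = sy-index.
  CONCATENATE 'C:\temp\part' lv_num '.txt' INTO lv_src.
  CLEAR lt_chunk.
  CALL FUNCTION 'GUI_UPLOAD'
    EXPORTING
      filename = lv_src
    TABLES
      data_tab = lt_chunk
    EXCEPTIONS
      OTHERS   = 1.
  CHECK sy-subrc = 0.
  LOOP AT lt_chunk INTO lv_line.
    TRANSFER lv_line TO lv_dest.
  ENDLOOP.
ENDDO.
CLOSE DATASET lv_dest.
```

Each chunk only has to fit into dialog memory on its own, which is the point of the split.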
2005 Aug 29 3:52 PM
Patrick - rather than gui_upload, can you use one of the FTP FMs? Or can basis FTP at the OS level?
Rob
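For reference, the SAPFTP route Rob mentions looks roughly like this (host, user, password and paths are invented; note that the password normally has to be scrambled with FM HTTP_SCRAMBLE before FTP_CONNECT, which is omitted here):

```abap
TYPES: ty_line(255) TYPE c.
DATA: lv_handle  TYPE i,
      lv_pwd(30) TYPE c VALUE 'secret', "scramble via HTTP_SCRAMBLE first
      lt_result  TYPE TABLE OF ty_line.

* Connect; destination 'SAPFTP' runs the FTP client on the front end
CALL FUNCTION 'FTP_CONNECT'
  EXPORTING
    user            = 'ftpuser'
    password        = lv_pwd
    host            = 'appserver.example.com'
    rfc_destination = 'SAPFTP'
  IMPORTING
    handle          = lv_handle.

* Push the big file from the PC to the application server host
CALL FUNCTION 'FTP_COMMAND'
  EXPORTING
    handle  = lv_handle
    command = 'put C:\temp\bigfile.txt /tmp/bigfile.txt'
  TABLES
    data    = lt_result.

CALL FUNCTION 'FTP_DISCONNECT'
  EXPORTING
    handle = lv_handle.
```

This avoids loading the whole file into an internal table in the dialog work process.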
2005 Aug 29 6:44 PM
Hi Patrick,
The issue is not with the upload function module; it is the internal table memory. Looking at the size of the file, I am assuming the number of records is on the order of 100 thousand. There is a Basis setting for the roll size memory that you can try tweaking.
Just curious: why is the file so big? Do you need to run this periodically or just once? If it is an ongoing process, why are you looking for desktop upload functionality rather than an application server upload?
Regards,
Srinivas
2005 Sep 07 12:51 PM
Nothing worked for us. We wrote to SAP directly and even they couldn't help us.
They're preparing a new OSS note for GUI_UPLOAD because of the problem with large files.
Thanks all for your help.
Regards,
Patrick
2005 Sep 07 2:08 PM
Hi ,
a colleague had the same problem with files >= 200 MB
-> SAP OSS told him to increase the virtual memory on his PC
Andreas
2005 Oct 18 9:51 AM
Hi,
maybe OSS note 872457 was created as a result of this discussion - just as Andreas had already experienced:
Solution
Basically, the front-end file modules are not intended for transferring large volumes of data. Instead, large files should be loaded directly from the application server by using OPEN DATASET. If the files can only be read by using gui_upload, you need to adjust the memory space restriction of the work process correspondingly.
Regards,
Christian