DTP step stuck at Extract. from Datastore step in Data package

srini_walgreens
Explorer

Hi Experts,

We are performing a delta data load from one ADSO to another ADSO, and this runs about 8 times every day. During one of the loads, the DTP takes much longer at the "Extract. from Datastore" step in the Data Package. There is code in the start and end routines, but the time is being spent at the extraction step itself, not in the routines.

The system has ample work processes available, and the tables used for lookups in the routine code have no load or activation running against them while the actual load is in progress. We even enabled a trace and noticed that most of the time is spent in the DB:open step.

We have 4 app servers and have allocated 25 work processes for this DTP. The only other activity is that multiple (around 4-5) activation jobs, each also allocated 25 WPs, are triggered during this load. A load that used to take 35-40 minutes is now taking 80-100 minutes, with no increase in data volume.

Any insights on this issue will be really helpful. 

Regards,

Srinivas

srini_walgreens
Explorer
Hi all, the issue has been identified. The load was getting stuck at the Extract step because the tables read in the start-routine lookup were also being accessed by another program in a different load. This was not visible even via the RSAABAP table, because our load runs as ABAP while the other load runs via a HANA procedure. After cancelling the second load, ours runs faster again.
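The root cause above is classic contention: while the other load holds the lookup table, the extract step can only wait, so the elapsed time grows with no extra CPU work or data volume. A toy Python analogy (no SAP APIs involved; the lock simply stands in for exclusive access to the shared table) sketches the effect:

```python
# Toy analogy, not SAP code: two "loads" competing for one lookup table,
# modelled as a lock. The waiting shows up as elapsed time, not CPU work.
import threading
import time

table_lock = threading.Lock()

def second_load(hold_seconds):
    # The other program (e.g. a HANA-procedure load) holds the table.
    with table_lock:
        time.sleep(hold_seconds)

def first_load_extract():
    # Our DTP's lookup must wait until the table is released.
    start = time.monotonic()
    with table_lock:
        pass
    return time.monotonic() - start

t = threading.Thread(target=second_load, args=(0.5,))
t.start()
time.sleep(0.1)              # make sure the second load grabs the lock first
waited = first_load_extract()
t.join()
print(f"extract step waited ~{waited:.1f}s")
```

Cancelling the competing load, as described above, is equivalent to releasing the lock early: the extract step's wait collapses to near zero.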
MKreitlein
Active Contributor

Hello @srini_walgreens 

unfortunately I don't understand every detail you describe ... e.g. what about the activation processes during data extraction? Your description should be a bit more precise.

Questions: Why do you use 25 processes in parallel for extraction? Do you have a package size of 50,000 records or higher? How many records are transferred in these 8 delta extractions during the day?

I would try reducing the number of parallel extraction processes to 5 at most, but increasing the package size to 250,000 ... depending on the width of one record (number of fields).
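The trade-off behind this suggestion can be shown with a quick back-of-envelope sketch: per-package overhead (such as DB:open, mentioned in the trace) scales with the number of packages, so fewer, larger packages cut fixed overhead even when total volume is unchanged. All numbers below are illustrative assumptions, not figures from this thread:

```python
# Back-of-envelope: how package size affects the number of data packages
# and therefore the total fixed per-package overhead. Numbers are assumed.

def n_packages(total_records, package_size):
    # Ceiling division: a partial last package still counts as a package.
    return -(-total_records // package_size)

total = 50_000_000                    # mid-range of the 30-80 million mentioned

small = n_packages(total, 100_000)    # current DTP package size
large = n_packages(total, 250_000)    # suggested package size

print(small, large)                   # 500 vs 200 packages

# With an assumed fixed ~2s open/close cost per package, the larger
# packages would save (500 - 200) * 2 = 600s of pure overhead.
```

The caveat about record width matters because larger packages also mean more memory per work process; the right ceiling depends on the system.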

Beyond that, I have no other ideas.

BR, Martin

srini_walgreens
Explorer

Hi MKreitlein,

Sure, please find the details below.

What about the activation processes during data extraction? - There are no other resource-intensive jobs running in parallel with the mentioned load, apart from a couple of activation jobs that get triggered. These activation jobs have 25 WPs each, set as the default at the system level. That was just one reason to suspect them, since they might occupy WPs in the system and slow down the load; hence I mentioned this detail.

Questions: Why do you use 25 processes in parallel for extraction? Do you have a package size of 50,000 records or higher? How many records are transferred in these 8 different delta extractions during the day? - 25 was the default setting of the DTP; since the volume it processes is high, we left it as is. The package size is 100,000. The number of records transferred in the other batches varies from 30 million to 80 million, but none of them takes this long to process.

I would try to reduce the number of parallel extraction processes to 5 max, but increase the package size to 250,000 ... depending on the width of one record (number of fields). - The target ADSO does not have many fields, so we will try this and see if it helps.

 

Regards,

Srinivas