
Full DTP taking too much time to load

Former Member
0 Kudos

Hi All ,

I am facing an issue where a DTP is taking too much time to load data from a DSO to a Cube, both via the process chain (PC) and when running it manually.

There are 6 similar DTPs which load data for different countries (different DSOs and Cubes as source and target respectively) for the last 7 days based on GI Date. All the DTPs pull almost the same number of records and finish within 25-30 minutes, but this one DTP takes around 3 hours. The problem started a couple of days back.

I have changed the parallel processes from 3 -> 4 -> 5 and the packet size from 50,000 -> 10,000 -> 1,00,000, but there is no improvement. I also want to mention that all the source DSOs and target Cubes have the same structure, and all the transformations have Field Routines and End Routines.

Can you all please share some pointers that might help?

Thanks

Prateek

Accepted Solutions (0)

Answers (3)

Former Member
0 Kudos

Please check the dimensions which are supposed to be line item dimensions, and also try reloading the cube after deleting the indexes.

Former Member
0 Kudos

Hi,

Reduce the DTP package size from 50,000 to 5,000.

Increasing the number of records per DTP package will have an adverse impact on the data load.

Try to reduce and check.

Regards

Mayank

Former Member
0 Kudos

Hi Mayank ,

I changed the DTP package size from 50,000 to 10,000 and the DTP now completes within 2.5 hours, compared to more than 4 hours initially. This serves my immediate purpose of saving the PC SLA. But I believe this is only a workaround, as I have other DTPs in the same PC which have a package size of 50,000 and complete in 25 minutes. Any pointers on why this is happening with only this DTP?

Also, while running SAP_INFOCUBE_DESIGNS for other cubes I get the same kind of ratios (greater than 100%), but those cubes are not impacted. Why is that?

Thanks

Prateek

purvang_zinzuwadia
Active Participant
0 Kudos

Hi Prateek,

I believe this particular DTP is taking more time because, for the set of records it is fetching, the logic in the transformation takes longer to execute.

I suppose you are reading a lot of data from other targets as well in the transformation field/start/end routines.
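If that is the case, the usual fix is to fetch the lookup data once per data package instead of with a SELECT per record. Below is a minimal sketch of what such an end routine body could look like; all DSO, table and field names (/BIC/AZSD_LKP00, MATERIAL, MATL_GROUP) are placeholders for illustration and not objects from this thread, and RESULT_PACKAGE / <RESULT_FIELDS> are the variables provided by the generated routine template.

* Hypothetical end routine body: buffer the lookup once per data package.
  TYPES: BEGIN OF ty_lookup,
           material   TYPE /bi0/oimaterial,
           matl_group TYPE /bi0/oimatl_group,
         END OF ty_lookup.

  DATA: lt_lookup TYPE STANDARD TABLE OF ty_lookup,
        ls_lookup TYPE ty_lookup.

  IF result_package IS NOT INITIAL.
*   One array fetch for the whole package instead of a SELECT SINGLE per row
    SELECT material matl_group
      FROM /bic/azsd_lkp00        "active table of the lookup DSO (placeholder)
      INTO TABLE lt_lookup
      FOR ALL ENTRIES IN result_package
      WHERE material = result_package-material.

    SORT lt_lookup BY material.
    DELETE ADJACENT DUPLICATES FROM lt_lookup COMPARING material.
  ENDIF.

  LOOP AT result_package ASSIGNING <result_fields>.
    READ TABLE lt_lookup INTO ls_lookup
         WITH KEY material = <result_fields>-material
         BINARY SEARCH.
    IF sy-subrc = 0.
      <result_fields>-matl_group = ls_lookup-matl_group.
    ENDIF.
  ENDLOOP.

The same idea applies to a field routine: move the expensive read into the start routine or a global buffer so it runs once per package rather than once per record.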

If you are not using a Semantic Group key in the DTP, consider using this option. It may drastically reduce the execution time.

Apart from this, also consider the design improvements suggested by others.

Hope this helps,

Purvang

Former Member
0 Kudos

Hi Prateek,

Data loading time depends on many factors such as routines, data volume, and the availability of background job resources. Reducing the DTP package size helps the data get processed faster if there is complexity involved in the transformation.

I hope you are deleting the indexes before loading into the InfoCube. If not, delete the indexes first and then load; it will definitely improve the load performance on the cube.
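In a process chain this is normally done with the Delete Index / Generate Index process types around the load. If you want to script it, the standard index function modules are commonly used roughly as sketched below (a hedged sketch only; verify the exact function module names and parameters in SE37 on your release):

* Drop the cube's secondary indexes before the load ...
  CALL FUNCTION 'RSDU_INFOCUBE_INDEXES_DROP'
    EXPORTING
      i_infocube = 'ETVC0006'.      "cube name taken from the report output in this thread

* ... execute the DTP / process chain load here ...

* ... and rebuild the indexes afterwards.
  CALL FUNCTION 'RSDU_INFOCUBE_INDEXES_REPAIR'
    EXPORTING
      i_infocube = 'ETVC0006'.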

Regards

Mayank

RamanKorrapati
Active Contributor
0 Kudos

Hi,

How much data is at your target cube level?

Please check dimension sizes by using report - SAP_INFOCUBE_DESIGNS.

If any one dimension exceeds 20% of the fact table size, then you need to introduce a line item dimension for the related InfoObject. That is, in the line item dimension you put only the one characteristic InfoObject that is occupying more space.

During your load you can also cross-check the available application server work processes.

For example, if you set 6 parallel processes but only 3 are available, you may face a performance issue; if 6 or more are free, your load will finish in time.

T code - SM50.

Thanks

Former Member
0 Kudos

HI Raman ,

This is what I get when I check the report. Could this be causing issues, since 2 rows have a ratio >= 100%?

ETVC0006           /BIC/DETVC00069     rows:      1.484    ratio:          0  %

ETVC0006           /BIC/DETVC0006C     rows: 15.059.600    ratio:        103  %

ETVC0006           /BIC/DETVC0006D     rows:        242    ratio:          0  %

ETVC0006           /BIC/DETVC0006P     rows:         66    ratio:          0  %

ETVC0006           /BIC/DETVC0006T     rows:        156    ratio:          0  %

ETVC0006           /BIC/DETVC0006U     rows:          2    ratio:          0  %

ETVC0006           /BIC/EETVC0006      rows: 14.680.700    ratio:        100  %

ETVC0006           /BIC/FETVC0006      rows:          0    ratio:          0  %

ETVC0007           rows: 13.939.200    density:              0,0  %

RamanKorrapati
Active Contributor
0 Kudos

You need to change one of the dimensions into a line item dimension.

So please check which InfoObject is occupying the most space in that dimension (the one above 20%).

Then create a line item dimension and put that space-heavy characteristic InfoObject into it on its own.
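Applying that to the SAP_INFOCUBE_DESIGNS output posted above: the ratio column is the dimension table's row count expressed as a percentage of the fact table rows, so /BIC/DETVC0006C with 15.059.600 rows against roughly 14.680.700 fact table rows (E table; the F table is empty) works out to about 103%, far beyond the 20% guideline. The characteristics in that C dimension are therefore the first candidates for a line item (or high cardinality) dimension.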

KodandaPani_KV
Active Contributor
0 Kudos

Hi,

Please make that specific dimension a line item dimension.

This means the dimension contains precisely one characteristic.

For a line item dimension, the system does not create a dimension table. Instead, the SID table of the characteristic takes on the role of the dimension table. Removing the dimension table has the following advantage:

- When loading transaction data, no IDs are generated for the entries in the dimension table.

This number range operation can compromise performance precisely in the case where a degenerated dimension is involved.

Thanks,

Phani.