on 2012 Jan 03 4:10 AM
Hi,
I have a data load from DSO1 to DSO2, where I also have 4 field-level routines that populate some fields from master data. The master data has roughly 50,000 to 80,000 records.
The load is taking more than 8 hours for 800,000 (8 lakh) records.
Can anyone guide me regarding performance?
The data packet size is 50,000 records per package.
No semantic key is enabled.
I have READ statements in the field routines; no loops.
Regards
Laxman.
Hi Laxman,
Try optimizing your code. Some points:
1. If possible, move the logic to a start or end routine. If not, build a global internal table in the start routine and put all SELECT statements there; in the field-level routine, only do a READ into a field symbol.
2. SELECT only the required fields, and use the INTO CORRESPONDING FIELDS OF addition in the SELECT.
3. Use FOR ALL ENTRIES IN SOURCE_PACKAGE.
4. REFRESH (clear) the global table in the end routine after the update.
5. Use field symbols instead of work areas; that makes a big difference.
6. In the READ, use BINARY SEARCH, and before that don't forget to SORT the data in the start routine by the same keys you use in the READ.
7. Set semantic keys so that all records get sorted by them; the SELECT with FOR ALL ENTRIES then becomes more efficient.
8. Field-level routines take more time than start or end routines, so switch to a start routine if possible.
9. If the extraction itself is taking long, reduce the package size.
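Points 1, 3, 5 and 6 above can be sketched roughly as follows. This is a minimal illustration, not your actual transformation: the master data table ZMASTER, its fields MATNR and ATTR1, and the SOURCE_PACKAGE field names are all placeholders that you would replace with your own objects.

```abap
*--- Global declaration part of the transformation class ---------------
TYPES: BEGIN OF ty_master,
         matnr TYPE c LENGTH 18,
         attr1 TYPE c LENGTH 10,
       END OF ty_master.
DATA: gt_master TYPE STANDARD TABLE OF ty_master.

*--- Start routine: one SELECT per data package, not one per record ----
* FOR ALL ENTRIES must only run on a non-empty driver table.
IF source_package IS NOT INITIAL.
  SELECT matnr attr1
    FROM zmaster
    INTO CORRESPONDING FIELDS OF TABLE gt_master
    FOR ALL ENTRIES IN source_package
    WHERE matnr = source_package-matnr.
  " Sort once by the lookup key so the field routine can use
  " BINARY SEARCH, and drop duplicates FOR ALL ENTRIES may return.
  SORT gt_master BY matnr.
  DELETE ADJACENT DUPLICATES FROM gt_master COMPARING matnr.
ENDIF.

*--- Field-level routine: buffered lookup, no database access ----------
FIELD-SYMBOLS: <fs_master> TYPE ty_master.
READ TABLE gt_master ASSIGNING <fs_master>
     WITH KEY matnr = source_fields-matnr
     BINARY SEARCH.
IF sy-subrc = 0.
  result = <fs_master>-attr1.
ENDIF.
```

With this pattern the database is hit once per package of 50,000 records instead of once per record, and each field-routine lookup is an in-memory binary search on the sorted buffer.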
Regards,
Jaya Tiwari