Application Development and Automation Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.

Performance Analysis

Former Member

Hi,

I am trying to improve the performance of a program.

The query below is executed inside a LOOP ... ENDLOOP:

SELECT MAX( sptag ) FROM s662 INTO s662-sptag
  WHERE matnr EQ itab-matnr
    AND vkorg EQ p_vkorg
    AND vkbur EQ itab-werks
    AND ssour EQ ' '
    AND vrsio EQ '000'.

I am trying to move this query out of the loop to improve performance by using a hashed table. However, I cannot do that because of non-unique entries.

The code I wrote is:

DATA: BEGIN OF wa_s662,  "header
        sptag LIKE s662-sptag,
        vkorg LIKE s662-vkorg,
        vkbur LIKE s662-vkbur,
        matnr LIKE s662-matnr,
      END OF wa_s662.

DATA: ht_s662 LIKE HASHED TABLE OF wa_s662
      WITH UNIQUE KEY vkorg vkbur matnr WITH HEADER LINE.

* To get last sales date
SELECT sptag vkorg vkbur matnr
  FROM s662 INTO TABLE ht_s662
  FOR ALL ENTRIES IN imara
  WHERE ssour EQ ' '
    AND vrsio EQ '000'
    AND vkorg EQ p_vkorg
    AND vkbur EQ imara-werks
    AND matnr EQ imara-matnr.

But the above query dumps because of duplicate entries.

Any suggestions please...

5 REPLIES

Former Member

Hi Sam,

Can you try deleting the duplicate records before filling the hashed table? For example, select into a standard table first, then:

SORT itab BY f1 f2 ...
DELETE ADJACENT DUPLICATES FROM itab COMPARING f1 f2 ...

Regards,

Sai
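
The suggestion above can be sketched as follows. This is only a sketch: lt_s662 is a hypothetical STANDARD TABLE with the same fields as wa_s662, filled by the FOR ALL ENTRIES select from the original post. Since the goal is MAX( sptag ) per key, sorting sptag descending makes DELETE ADJACENT DUPLICATES keep the latest date for each key:

```abap
* Sketch: de-duplicate before inserting into the hashed table.
* lt_s662 is an assumed standard table (duplicates allowed there).
* Sorting sptag DESCENDING keeps the latest date, i.e. MAX( sptag ),
* per vkorg/vkbur/matnr when adjacent duplicates are deleted.
SORT lt_s662 BY vkorg vkbur matnr sptag DESCENDING.
DELETE ADJACENT DUPLICATES FROM lt_s662
       COMPARING vkorg vkbur matnr.

* Each key is now unique, so this INSERT no longer dumps.
INSERT LINES OF lt_s662 INTO TABLE ht_s662.
```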


Former Member

I think you can declare the internal table ht_s662 as a STANDARD TABLE instead. Duplicates are allowed there, so the SELECT will not dump.


Former Member

Hi

You may actually be making this slower than the original by using a hashed table. Insertion into a hashed table is relatively expensive; a hashed table pays off when it is used as a lookup table, because the hashing algorithm locates a record by its full key. I would suggest you look at other options to optimize the code, such as:

1) Moving the SELECT outside the LOOP ... ENDLOOP, for example by using the FOR ALL ENTRIES clause.

2) Rephrasing the WHERE clause to use appropriate primary/secondary indexes.

Regards

Ranganath
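
Suggestion 1 can be sketched as follows, reusing the table and field names from the original post. This is a sketch, not a drop-in fix: lt_s662 is a hypothetical standard table target (duplicates are allowed in it, so no dump occurs), and the emptiness check guards against the well-known FOR ALL ENTRIES pitfall of an empty driver table selecting every row:

```abap
* Sketch: one SELECT outside the loop instead of one per row.
* lt_s662 is an assumed STANDARD TABLE with fields
* sptag, vkorg, vkbur, matnr (same as wa_s662 in the post).
IF imara[] IS NOT INITIAL.
  " Guard: FOR ALL ENTRIES with an empty table would select all rows.
  SELECT sptag vkorg vkbur matnr
    FROM s662
    INTO CORRESPONDING FIELDS OF TABLE lt_s662
    FOR ALL ENTRIES IN imara
    WHERE ssour EQ ' '
      AND vrsio EQ '000'
      AND vkorg EQ p_vkorg
      AND vkbur EQ imara-werks
      AND matnr EQ imara-matnr.
ENDIF.
```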



Thank you, friends, for your time.

1. I cannot delete duplicates, because the SELECT itself is throwing a dump. I can delete duplicates only after selecting the records into the internal table.

2. Yes, I will use the FOR ALL ENTRIES addition.

3. I used a hashed table because the number of entries is very large, on the order of 30 million. If I use a standard table, I think it will again cause delays, and reading from that table inside the loop might cause delays or time-outs.



A SELECT inside a loop is not recommended.

SELECT from both tables outside the loop using FOR ALL ENTRIES, then loop over the smaller internal table.

Inside the loop, do a single READ statement; performance will definitely improve.
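
The pattern described above can be sketched like this. It assumes ht_s662 has already been filled (after de-duplication) with one row per vkorg/vkbur/matnr, and wa_imara is a hypothetical work area for imara; on a hashed table, READ ... WITH TABLE KEY is a constant-time lookup, so it stays fast even with very large tables:

```abap
* Sketch: one READ per loop pass against the hashed table,
* replacing the SELECT MAX( sptag ) inside the loop.
LOOP AT imara INTO wa_imara.
  READ TABLE ht_s662 INTO wa_s662
       WITH TABLE KEY vkorg = p_vkorg
                      vkbur = wa_imara-werks
                      matnr = wa_imara-matnr.
  IF sy-subrc = 0.
    " wa_s662-sptag now holds the last sales date for this material.
  ENDIF.
ENDLOOP.
```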