‎2008 Apr 15 1:09 PM
Hi,
We have the following query which is causing performance issues. Let me know how we can optimise it.
SELECT
VBELN
POSNR
MATNR
MEINS
VRKME
VSTEL
ABGRU
MVGR5
CUOBJ
NETPR
ZMENG
KWMENG
KPEIN
KMEIN
NETWR
WAERK
Z_SERV_TYP_ORIG
Z_SERV_TYP_DEST
Z_SHIP_TYP_ORIG
Z_SHIP_TYP_DEST
Z_EQUIPMENT_TYPE
Z_CONTAINER_SIZE
Z_CONTAINER_TYPE
Z_OTI
Z_HUB_PORT
Z_PASSTHROUGH
Z_SHIPMENTTERM
ARKTX
KBMENG
INTO TABLE I_VBAP
FROM VBAP
FOR ALL ENTRIES IN I_VBAK
WHERE VBELN = I_VBAK-VBELN
AND MATNR IN S_MATNR
AND Z_SERV_TYP_ORIG IN S_SERTO
AND Z_SERV_TYP_DEST IN S_SERTD
AND Z_SHIP_TYP_ORIG IN S_STO
AND Z_SHIP_TYP_DEST IN S_STD
AND Z_EQUIPMENT_TYPE IN S_EQTY
AND Z_CONTAINER_SIZE IN S_CONTS
AND Z_CONTAINER_TYPE IN S_CONTT
AND Z_VESSEL_CODE IN S_VES
AND Z_VOYAGE_NUMBER IN S_VOY
AND Z_OTI IN S_OTI
AND Z_HUB_PORT IN S_HUB
AND Z_PASSTHROUGH IN S_PASS
AND Z_SHIPMENTTERM IN S_ST.
The only compulsory field in the WHERE clause is the sales order number (VBELN). All the other fields are optional and may or may not have values in them.
Thanks in advance.
Mick
‎2008 Apr 15 1:16 PM
I would start by sorting the i_vbak table by VBELN, and deleting any duplicates from this table ...
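A minimal sketch of that step (using the table name from the original post):

```abap
* Sort the driver table and remove duplicate order numbers so that
* FOR ALL ENTRIES does not request the same VBELN more than once.
SORT i_vbak BY vbeln.
DELETE ADJACENT DUPLICATES FROM i_vbak COMPARING vbeln.
```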
‎2008 Apr 15 1:19 PM
Well, if I_VBAK is empty or contains many (all?) of the existing entries of VBAK, then you're scanning the entire VBAP table, which will take time no matter what, depending on the number of entries.
At least make sure I_VBAK is not empty and that it is sorted by VBELN. If S_MATNR contains only a few material numbers, access via VAPMA should be investigated.
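A sketch of that safeguard (illustrative only; the names are taken from the original post, and the field list is abbreviated):

```abap
* With an empty driver table, FOR ALL ENTRIES ignores the rest of the
* WHERE clause and reads all of VBAP - so guard against that first.
IF i_vbak[] IS NOT INITIAL.
  SORT i_vbak BY vbeln.
  SELECT vbeln posnr matnr  " ...remaining fields as in the original query
    INTO TABLE i_vbap
    FROM vbap
    FOR ALL ENTRIES IN i_vbak
    WHERE vbeln = i_vbak-vbeln
      AND matnr IN s_matnr.
ENDIF.
```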
Cheers
Thomas
‎2008 Apr 15 1:20 PM
Hi,
Follow these steps:
1. Use the primary and secondary keys in the WHERE condition.
2. Use the SORT statement.
3. Use DELETE ADJACENT DUPLICATES if necessary.
Please check the optimisation in ST05 and proceed.
regards,
venkat.
‎2008 Apr 15 1:48 PM
Hi,
1) The internal table is already sorted.
2) The WHERE clause contains the sales order number (VBELN), which is part of the primary key. We cannot have the item number (POSNR) in the WHERE clause, as we are querying VBAK first and then passing all the sales order numbers to VBAP.
3) I_VBAK is not empty
I_VBAK can have a large number of entries.
Please help.
Thanks,
Mick
‎2008 Apr 15 6:55 PM
Hi Mick,
Try using PACKAGE SIZE in your query. PACKAGE SIZE gets the data into your internal table in chunks of up to 2 GB.
See below... I have picked it up from one of the WIKIs.
Sometimes tables can have a very large amount of data, which may exceed the 2 GB limit of the internal table and lead to a short dump. In that case we can use the technique below to fetch the data.
It has 3 steps:
1. Calculating the memory occupied by one record.
2. Calculating the maximum number of records that can be accommodated in the internal table.
3. Fetching that many records at a time and performing the desired operation.
PARAMETERS : p_table TYPE ddobjname.

DATA : db_cursor TYPE cursor.
DATA : lt_dfies       TYPE TABLE OF dfies,
       ls_dfies       TYPE dfies,
       struc_size     TYPE i VALUE 0,
       g_package_size TYPE i.
FIELD-SYMBOLS : <xtab> TYPE ANY TABLE.
DATA : l_refitab TYPE REF TO data.

CREATE DATA l_refitab TYPE TABLE OF (p_table).
ASSIGN l_refitab->* TO <xtab>.

* Get the nametab (field descriptions) of the table
CALL FUNCTION 'DDIF_NAMETAB_GET'
  EXPORTING
    tabname   = p_table
  TABLES
    dfies_tab = lt_dfies
  EXCEPTIONS
    not_found = 1
    OTHERS    = 2.
IF sy-subrc <> 0.
  MESSAGE ID sy-msgid TYPE sy-msgty NUMBER sy-msgno
          WITH sy-msgv1 sy-msgv2 sy-msgv3 sy-msgv4.
  EXIT.
ENDIF.

* Logic for calculating the package size:
* calculate the memory taken by one record
LOOP AT lt_dfies INTO ls_dfies.
  struc_size = struc_size + ls_dfies-leng.
ENDLOOP.

* Calculate the maximum number of records that can be
* accommodated in 2 GB
g_package_size = 2147483648 / struc_size.

OPEN CURSOR WITH HOLD db_cursor FOR
  SELECT * FROM (p_table)
  BYPASSING BUFFER.
DO.
* Fetch the data in packages
  FETCH NEXT CURSOR db_cursor
        INTO CORRESPONDING FIELDS OF TABLE <xtab>
        PACKAGE SIZE g_package_size.
  IF sy-subrc NE 0.
    CLOSE CURSOR db_cursor.
    EXIT.
  ENDIF.
* Here do the operation you want on internal table <xtab>
ENDDO.
This way you get the data in chunks that fit within the 2 GB limit and thus avoid the short dump. As we are dynamically calculating the maximum number of records per package, we reduce the database hits to the minimum. It works for small tables as well.
‎2008 Apr 15 6:57 PM
Also, Mick,
I can see a lot of S_something variables in your code; I assume they are select-options.
Are you filling these selection criteria on your selection screen before executing the report?
Filling all of them would really help, as it would speed up your report.
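For context on why unfilled criteria do not narrow the query: an unfilled select-option is an empty range table, and `field IN s_xxx` with an empty range is true for every row, so it does not restrict the selection at all. A small illustrative sketch (the report name is hypothetical; S_MATNR mirrors the original post):

```abap
REPORT zdemo_selopt.  " hypothetical report name, for illustration only

TABLES : vbap.
SELECT-OPTIONS : s_matnr FOR vbap-matnr.

START-OF-SELECTION.
* An unfilled select-option is an empty range table;
* "MATNR IN s_matnr" then matches every row of VBAP.
  IF s_matnr[] IS INITIAL.
    WRITE : / 'S_MATNR is empty - it does not restrict the selection'.
  ENDIF.
```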