‎2008 Feb 19 3:27 PM
Hello,
The logic shown below works fine. However, when selecting a large number of rows there is a performance issue. I believe it is caused by the fact that every line is appended to the internal table one at a time inside the SAVE_LINE subroutine. Is there any way to build Native SQL where the data is selected and saved into the internal table as a snapshot (equivalent to "SELECT * INTO TABLE ..."), rather than one line at a time?
Thanks,
Alex
TRY.
    EXEC SQL PERFORMING save_line.
      SELECT "ACTION_ITEMS"."ACTION_ITEM_ID",
             "ACTION_ITEMS"."PTY_ID",
             "ACTION_ITEMS"."ACTION_ITEM_DESC"
        INTO :ITAB-ACTION_ITEM_ID,
             :ITAB-PTY_ID,
             :ITAB-ACTION_ITEM_DESC
        FROM "ACTION_ITEMS"
        WHERE "ACTION_ITEMS"."ASSIGN_EMP_PTY_ID" = :PARTY_ID
    ENDEXEC.
  CATCH cx_sy_native_sql_error INTO errclas.
    err_text = errclas->get_text( ).
ENDTRY.
.....
FORM save_line.
  APPEND itab.
ENDFORM.
‎2008 Feb 19 3:59 PM
Rob,
Thanks a lot for the sample. It did perform better than my code. But it also appended one line at a time. Is there any way to avoid appending one line at a time?
Regards,
Alex
Edited by: Alex Trosman on Feb 19, 2008 5:00 PM
‎2008 Feb 19 3:39 PM
Hi Rob,
Would you have any examples on how to do it?
Thanks,
Alex
‎2008 Feb 19 3:43 PM
This is something you can try, but I'm really not sure if it'll be any faster:
TABLES payr.

DATA: wa_src LIKE payr,
      i_src  LIKE TABLE OF payr,
      l_str1(20) TYPE c.

PARAMETERS: l_string(20) TYPE c.

START-OF-SELECTION.
  CONCATENATE '%' l_string '%' INTO l_str1.

  EXEC SQL.
    OPEN c FOR
      SELECT *
        FROM PAYR
        WHERE UPPER(ZANRE) LIKE :L_STR1
           OR UPPER(ZNME1) LIKE :L_STR1
           OR UPPER(ZNME2) LIKE :L_STR1
           OR UPPER(ZNME3) LIKE :L_STR1
           OR UPPER(ZNME4) LIKE :L_STR1
  ENDEXEC.

  DO.
    EXEC SQL.
      FETCH NEXT c INTO :WA_SRC
    ENDEXEC.
    IF sy-subrc <> 0.
      EXIT.
    ENDIF.
    APPEND wa_src TO i_src.
  ENDDO.

  EXEC SQL.
    CLOSE c
  ENDEXEC.

  LOOP AT i_src INTO wa_src.
    WRITE: /001 wa_src-zanre,
                wa_src-znme1,
                wa_src-znme2,
                wa_src-znme3,
                wa_src-znme4.
  ENDLOOP.
Rob
‎2008 Feb 19 3:43 PM
Just curious: what is your motivation for using Native SQL rather than Open SQL?
Cheers
Thomas
‎2008 Feb 19 3:56 PM
I am using Native SQL because I need to get data from an external Oracle database within ABAP. This is the only way that I know how to do it. I can also run a stored procedure, but it creates a similar performance issue.
If you know of any other way, please let me know.
Regards,
Alex
‎2008 Feb 19 4:00 PM
Native SQL has performance issues. Try your way a number of times and mine a number of times, and use whichever takes the least time.
Rob
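Something like this around each variant will do for the comparison (a sketch; the subroutine name is just a placeholder for the variant under test):

```abap
DATA: t0 TYPE i,
      t1 TYPE i.

GET RUN TIME FIELD t0.         " microseconds since session start
PERFORM read_via_exec_sql.     " placeholder: run one variant here
GET RUN TIME FIELD t1.

WRITE: / 'Runtime in microseconds:', t1 - t0.
```

Run it several times and compare the later runs, since the first run pays for cold buffers.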
‎2008 Feb 19 4:04 PM
>get data from external oracle database within ABAP
I see. Well, see Rob's comments about the limitations of Native SQL; you probably have to live with that.
Cheers
Thomas
‎2008 Feb 19 4:08 PM
Thanks for the help, guys. I will use FETCH since it is a bit faster, and we'll have to live with the performance.
‎2008 Feb 19 4:12 PM
OK - but be sure to run it a number of times in an environment where there's lots of data. Buffering may skew the analysis.
Rob
‎2008 Mar 13 9:59 AM
Hi all,
it would be interesting to know how SAP implemented its
SELECT ... INTO TABLE itab statement.
They must go to the database either way, but what do they do there?
a) They can open a cursor and fetch "behind the scenes" into the table. You don't need a line-by-line approach if you have nothing to process row by row.
b) They can trigger a bulk read from the database using an array-fetch approach. That may speed things up, but the array contents still have to be stuffed into the internal table, so it is at best a kind of package processing. A bulk read uses a certain array package size (say, 100 entries per fetch); an unlimited array size would run into memory problems.
And I bet SAP hates running into that problem. See how they convert big tables in SE14...
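The package idea in (b) is visible in Open SQL itself, where you can ask for block-wise transport explicitly (a sketch against the standard demo table SFLIGHT):

```abap
DATA itab TYPE TABLE OF sflight.

" Rows arrive from the database in blocks of 100;
" each pass through the SELECT loop processes one package.
SELECT * FROM sflight
  INTO TABLE itab
  PACKAGE SIZE 100.

  " itab holds at most 100 rows here; sy-dbcnt counts cumulatively
  WRITE: / 'Rows read so far:', sy-dbcnt.
ENDSELECT.
```

So even the "snapshot" SELECT INTO TABLE is, underneath, some flavor of packaged transfer.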
‎2009 Jan 25 5:24 PM
Hi,
I too am having performance issues. I am reading 250K records and it is a very complex query. I would like to execute a procedure instead, something like
open c for
exec z_table sy-mandt p_kschl
How do I pass the variables? I am getting an error when I pass them.
If I create a view in the back end and use a SELECT, it again takes a lot of time. The approach with the procedure significantly reduces my database access time. Any suggestions?
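From the documentation, I believe the call should mark each parameter with its mode and prefix host variables with a colon, something like this (a sketch; Z_TABLE and its two input parameters are my own objects):

```abap
" Native SQL stored-procedure call: parameters are declared
" IN/OUT and host variables carry a leading colon.
EXEC SQL.
  EXECUTE PROCEDURE z_table ( IN :sy-mandt, IN :p_kschl )
ENDEXEC.
```

But this is exactly where I am getting the error.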
Warm Regards,
Abdullah
‎2009 Mar 12 9:27 AM
Check the SAP demo program ADBC_DEMO. It shows how to get the result set of your query into an internal table using the standard ADBC classes.
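A minimal sketch along the lines of that demo (the table, columns, and row type are placeholders; exception handling for cx_sql_exception is omitted):

```abap
DATA: lo_stmt TYPE REF TO cl_sql_statement,
      lo_res  TYPE REF TO cl_sql_result_set,
      lt_rows TYPE TABLE OF ty_row,   " structure matching the SELECT list
      lr_data TYPE REF TO data.

CREATE OBJECT lo_stmt.               " uses the default connection

" Execute the query and bind the internal table as fetch target
lo_res = lo_stmt->execute_query(
  'SELECT action_item_id, pty_id FROM action_items' ).

GET REFERENCE OF lt_rows INTO lr_data.
lo_res->set_param_table( lr_data ).

" next_package( ) fills lt_rows with a whole block of rows at once,
" instead of appending one line at a time
WHILE lo_res->next_package( ) > 0.
  " process the current package in lt_rows
ENDWHILE.

lo_res->close( ).
```

This gives you the array-fetch behavior discussed above without a line-by-line PERFORMING routine.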