Application Development and Automation Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.

Native SQL - data select performance issue

former_member199351
Active Participant

Hello,

The logic shown below works fine. However, when selecting a large number of rows there is a performance issue. I believe it is caused by the fact that every line is appended to the internal table within the SAVE_LINE subroutine one at a time. Is there any way to build Native SQL where the data is selected and saved into the internal table as a snapshot (equivalent to "SELECT * INTO TABLE ..."), rather than one line at a time?

Thanks,

Alex

TRY.
    EXEC SQL PERFORMING save_line.
      SELECT "ACTION_ITEMS"."ACTION_ITEM_ID",
             "ACTION_ITEMS"."PTY_ID",
             "ACTION_ITEMS"."ACTION_ITEM_DESC"
        INTO :ITAB-ACTION_ITEM_ID,
             :ITAB-PTY_ID,
             :ITAB-ACTION_ITEM_DESC
        FROM "ACTION_ITEMS"
        WHERE "ACTION_ITEMS"."ASSIGN_EMP_PTY_ID" = :PARTY_ID
    ENDEXEC.
  CATCH cx_sy_native_sql_error INTO errclas.
    err_text = errclas->get_text( ).
ENDTRY.

.....

FORM save_line.
  APPEND itab.
ENDFORM.

1 ACCEPTED SOLUTION

former_member199351
Active Participant

Rob,

Thanks a lot for the sample. It did perform better than my code. But it also appended one line at a time. Is there any way to avoid appending one line at a time?

Regards,

Alex

Edited by: Alex Trosman on Feb 19, 2008 5:00 PM

14 REPLIES

Former Member

You might try opening a CURSOR and using FETCH NEXT.

Rob


former_member199351
Active Participant

Hi Rob,

Would you have any examples on how to do it?

Thanks,

Alex


This is something you can try, but I'm really not sure if it'll be any faster:

TABLES payr.

DATA:
wa_src LIKE payr,
i_src LIKE TABLE OF payr,
l_str1(20) TYPE c.


PARAMETERS: l_string(20) TYPE c.

START-OF-SELECTION.
  CONCATENATE '%' l_string '%' INTO l_str1.

  EXEC SQL.
    OPEN C FOR
    SELECT *
    FROM  PAYR
    WHERE UPPER(ZANRE)  LIKE  :L_STR1
    OR    UPPER(ZNME1)  LIKE  :L_STR1
    OR    UPPER(ZNME2)  LIKE  :L_STR1
    OR    UPPER(ZNME3)  LIKE  :L_STR1
    OR    UPPER(ZNME4)  LIKE  :L_STR1
  ENDEXEC.

  DO.
    EXEC SQL.
      FETCH NEXT C INTO :wa_SRC
    ENDEXEC.
    IF sy-subrc <> 0.
      EXIT.
    ENDIF.
    APPEND wa_src TO i_src.
  ENDDO.

  EXEC SQL.
    CLOSE C
  ENDEXEC.

  LOOP AT i_src INTO wa_src.
    WRITE: /001 wa_src-zanre,
                wa_src-znme1,
                wa_src-znme2,
                wa_src-znme3,
                wa_src-znme4.
  ENDLOOP.

Rob


ThomasZloch
Active Contributor

Just curious, what is your motivation to use Native SQL rather than Open SQL?

Cheers

Thomas


former_member199351
Active Participant

I am using Native SQL because I need to get data from an external Oracle database from within ABAP. This is the only way that I know how to do it. I can also run a stored procedure, but it creates a similar performance issue.

If you know of any other way, please let me know.

Regards,

Alex



Native SQL has performance issues. Try your way a number of times and mine a number of times, and use whichever takes the least time.

Rob



>get data from external oracle database within ABAP

I see. Well, see Rob's comments about the limitations of Native SQL; you probably have to live with that.

Cheers

Thomas




Not that I know of - that's why we use ABAP.

Rob


former_member199351
Active Participant

Thanks for the help, guys. I will use FETCH since it is a bit faster, and we'll have to live with the performance.



OK - but be sure to run it a number of times in an environment where there's lots of data. Buffering may skew the analysis.

Rob


Former Member

Hi all,

it would be interesting to know how SAP implemented its SELECT INTO internal_table statement. At the very least it has to go to the database and do what?

a) It can open a cursor and fetch "behind the doors" into the table. You don't need a line-by-line approach if you have nothing to process row by row.

b) It can trigger a bulk read on the database and use an array-fetch approach. That may speed things up, and the array contents then have to be stuffed into the internal table. But that is at least some kind of package processing. A bulk read uses a certain array package size (say, 100 entries per fetch). If you implemented a kind of unlimited array size, you would run into memory problems.

And I bet SAP hates to run into that problem. See how they convert big tables in SE14...
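(The package-wise fetch described in b) is roughly what the ADBC classes expose to ABAP code: CL_SQL_RESULT_SET->NEXT_PACKAGE fetches a block of rows per call instead of one row per FETCH NEXT. A minimal sketch of the idea, assuming a DDIC structure ZACTION_ITEMS exists that matches the result columns - the names here are placeholders, not tested code:

DATA: lt_pkg    TYPE TABLE OF zaction_items,
      lt_all    TYPE TABLE OF zaction_items,
      lr_data   TYPE REF TO data,
      lo_stmt   TYPE REF TO cl_sql_statement,
      lo_result TYPE REF TO cl_sql_result_set,
      lv_rows   TYPE i.

CREATE OBJECT lo_stmt.
lo_result = lo_stmt->execute_query( 'SELECT * FROM zaction_items' ).

" Bind the internal table as the fetch target
GET REFERENCE OF lt_pkg INTO lr_data.
lo_result->set_param_table( lr_data ).

" Fetch in packages of 100 rows instead of one row at a time
DO.
  lv_rows = lo_result->next_package( upto = 100 ).
  IF lv_rows = 0.
    EXIT.
  ENDIF.
  APPEND LINES OF lt_pkg TO lt_all.
ENDDO.

lo_result->close( ).

)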



Hi,

I too am having performance issues. I am reading 250K records and it's a very complex query. I would like to execute a procedure instead, something like

 
open c for 
  exec z_table sy-mandt p_kschl

How do I pass the variables? I am getting an error when I pass the variables.

If I create a view in the back end and use SELECT, then it again takes a lot of time. The approach I am adopting with the procedure significantly reduces my database access time. Any suggestions?

Warm Regards,

Abdullah
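(For calling a database procedure from static Native SQL, the documented form is EXECUTE PROCEDURE, with each parameter prefixed by an IN/OUT marker and passed as a host variable with a colon. A sketch using the names from the post above - whether Z_TABLE actually has this two-parameter signature is an assumption:

DATA: lv_mandt TYPE sy-mandt,
      lv_kschl TYPE kschl.   " assuming P_KSCHL has this type

lv_mandt = sy-mandt.
lv_kschl = p_kschl.

EXEC SQL.
  EXECUTE PROCEDURE z_table ( IN :lv_mandt, IN :lv_kschl )
ENDEXEC.

)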



Check the SAP demo program ADBC_DEMO. It will help you get the result set of your query into your internal tables using standard classes.
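(Along those lines, the original query could be rewritten with the ADBC classes so the whole result set lands in the internal table with one array fetch, instead of one APPEND per row. A sketch, assuming the external Oracle system is set up as a secondary connection in table DBCON - the connection name 'ORA_EXT' and the lowercase object names are placeholders for the poster's system; the ? placeholder is bound positionally with SET_PARAM:

DATA: lo_con    TYPE REF TO cl_sql_connection,
      lo_stmt   TYPE REF TO cl_sql_statement,
      lo_result TYPE REF TO cl_sql_result_set,
      lr_party  TYPE REF TO data,
      lr_itab   TYPE REF TO data,
      lx_sql    TYPE REF TO cx_sql_exception.

TRY.
    " Secondary DB connection maintained in DBCON (name is a placeholder)
    lo_con  = cl_sql_connection=>get_connection( 'ORA_EXT' ).
    lo_stmt = lo_con->create_statement( ).

    " Bind the host variable for the ? placeholder
    GET REFERENCE OF party_id INTO lr_party.
    lo_stmt->set_param( lr_party ).

    lo_result = lo_stmt->execute_query(
      'SELECT action_item_id, pty_id, action_item_desc FROM action_items WHERE assign_emp_pty_id = ?' ).

    " Bind ITAB and fetch the result set package-wise into it
    GET REFERENCE OF itab INTO lr_itab.
    lo_result->set_param_table( lr_itab ).
    lo_result->next_package( ).
    lo_result->close( ).
  CATCH cx_sql_exception INTO lx_sql.
    err_text = lx_sql->get_text( ).
ENDTRY.

)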