Application Development and Automation Discussions
Join the discussions or start your own on all things application development, including tools and APIs, programming models, and keeping your skills sharp.

Program dumps for larger files - SAPSQL_ARRAY_INSERT_DUPREC

Former Member

Hi Experts,

I'm executing a program in the background that fetches data from a file and then inserts the records into four Z tables. When I run this program with small files, all records are inserted successfully. But when I run it with larger files (almost 80 MB), I get the runtime error SAPSQL_ARRAY_INSERT_DUPREC.

I checked all the tables for duplicate entries, but couldn't find any duplicate keys.

I have also checked related SCN posts, but I can't figure out why it is not working for larger files.

Thank you for any help !


Veena

10 Replies

SimoneMilesi
Active Contributor

I do not think it's a matter of file size.

The dump is clear: an insert with an already existing key.

You can put a TRY ... CATCH around the insert and write the failing record to the output.

Or, before the INSERT statement, you can do a SELECT COUNT for the keys and, if the count is greater than 0, write the record to the output and skip the insert.
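A minimal sketch of both suggestions. The table name ztab, internal table lt_data, and key field matnr are hypothetical placeholders, not names from the original post:

```abap
" Variant 1: catch the duplicate-key exception instead of letting the
" array insert dump with SAPSQL_ARRAY_INSERT_DUPREC.
TRY.
    INSERT ztab FROM TABLE lt_data.
  CATCH cx_sy_open_sql_db INTO DATA(lx_db).
    WRITE: / 'Array insert failed:', lx_db->get_text( ).
ENDTRY.

" Variant 2: pre-check each record's key before inserting it.
LOOP AT lt_data INTO DATA(ls_data).
  SELECT COUNT(*) FROM ztab WHERE matnr = @ls_data-matnr.
  IF sy-dbcnt > 0.
    WRITE: / 'Duplicate key skipped:', ls_data-matnr.
  ELSE.
    INSERT ztab FROM @ls_data.
  ENDIF.
ENDLOOP.
```

Variant 2 is slower (one SELECT per record) but tells you exactly which records collide.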


Thanks for your quick reply.

When I try the same file with fewer records, it inserts successfully. Why is that happening? Any idea?


Hello Veena,

It is clear that either you have duplicate records in the file you are uploading, or records with the same primary keys already exist in the DB.

I think when you upload a file with fewer records, you happen to have removed the record that actually causes the problem, so the insert operation succeeds.

Try what Simone has suggested and find out which records cause the issue.

Regards,


I think it's just by chance that it works with smaller files. Go to ST22, open the dump, and hit 'long text'. There you'll find dumps of some program variables; analyze them to see if you get any hints on the values, or on which Z table is causing the issue.


As I said, maybe with a huge number of records you have some duplicate you do not recognize.

In addition to my previous reply, add a COMMIT WORK after each insert so you can check whether the duplicates come from the file.

ATTENTION!

Adding a COMMIT WORK for each insert will kill performance! I suggest you use it just in development to check the file, and then implement some check logic like:


SORT tb_file BY <keys of tables>.

LOOP AT tb_file.

  AT NEW key.
    CLEAR lv_counter.
  ENDAT.

  ADD 1 TO lv_counter.

  AT END OF key.
    IF lv_counter > 1.
      " duplicate records for this key
    ENDIF.
  ENDAT.

ENDLOOP.

former_member202771
Contributor

Hi Veena,

This dump comes when we try to insert duplicate records with the same primary key.

Thanks,

Anil


Hi Anil,

I checked the tables for those keys. They don't exist in my Z tables. And it works when I try with smaller files.


In your larger file, try sorting it by the key you want after uploading it, and check the entries in the debugger.

It could also be that a bit of program logic is incorrect. For example, even if the file is correct, you may actually be creating the record incorrectly if your logic is not assigning part of the key properly.

I find 9 times out of 10 the problem is a couple of 'rogue' records/permutations not seen before: they may be dropping through whatever logic you have, so you are in fact creating a duplicate key with a blank or partly blank value if the key has more than one part. Check this out.

Try a test online rather than in background: straight after the short dump, enter the debugger and check the variables at the create point.

Work through the above and I am sure you will find the problem. My bet is a permutation in the input file, or error data lines not seen before, causing you to create the entry with a partly or wholly incorrect key. Probably a blank/null key, and hence a duplicate occurs.

Former Member

As your error is related to duplicate entries, it can occur in 2 cases:

1. The DB already has an entry with the same primary key.

2. The internal table used for insertion has duplicate entries.

If you are inserting the entries with a single INSERT statement from an internal table, you can solve this in either of 2 ways:

1. Sort the records in the internal table by the key fields of the table, then use "DELETE ADJACENT DUPLICATES FROM <itab> COMPARING <key fields list>" to delete the duplicate entries. This helps if the DB table does not already have entries for the same keys as the internal table.

2. Use a MODIFY statement instead of an INSERT statement to update the database table. Duplicate entries can then remain in the internal table, but only the last duplicate for each key ends up in the DB table.
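A sketch of both options, again with hypothetical names (ztab, lt_data, key fields keyfld1 and keyfld2) standing in for the real ones:

```abap
" Option 1: strip in-table duplicates before the array insert.
" Sort by the full DB key first - DELETE ADJACENT DUPLICATES only
" removes duplicates that sit next to each other.
SORT lt_data BY keyfld1 keyfld2.
DELETE ADJACENT DUPLICATES FROM lt_data COMPARING keyfld1 keyfld2.
INSERT ztab FROM TABLE lt_data.

" Option 2: MODIFY inserts new keys and overwrites existing ones,
" so duplicates no longer dump - the last record per key wins.
MODIFY ztab FROM TABLE lt_data.
```

Option 1 keeps the first record per key and still dumps if the DB already contains one of the keys; Option 2 silently overwrites, so use it only when "last one wins" is acceptable.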


Former Member

Problem solved.

The issue was with the internal table, which had multiple entries with the same key.

Thanks to all.