
Duplicate data record detected when updating to DSO

Former Member

I have a requirement to calculate the time in job based on employee, position, job role and job stage. Employee and calendar month are direct source fields; the remaining fields are looked up from two other DSOs, so I wrote the code in the end routine. The expected result is that one employee can have more than one position, job role and job stage. But when I execute the DTP I get the error "duplicate data record detected when updating to the DSO", even though there are no duplicate records for the key field combination of employee, position, job role, job stage and calendar month.

I found that only employee and calendar month are the semantic keys in the semantic group; I did not find the other key fields that I defined in the DSO.

I have also tried it in a start routine and face the same error; there too I did not find any more keys in the semantic group.

One more surprising thing: when I open the new data table in SE11 and check the 5 key fields at the new data table level, it works fine and I get the expected records. I am not able to do the same in the production system, even with firefighter access.

Can anybody please help me out on this?

Accepted Solutions (1)


timkorba
Participant

Dommaraju,

Can you provide a little more information on what you are trying to do? It seems that the key of the DSO you are writing to is incorrect. Are you trying to create additional records, or just add values to fields in the same record? Let me know and I will be able to provide additional help.

Thanks.

Former Member

Hi,

I am trying to create additional records for the same employee, but there are no duplicate records for the key combination I defined in the DSO. Please see the screenshots posted above for what exactly I am doing.

Thanks.

anshu_lilhori
Active Contributor

Hi,

You must be getting the error during activation because the record count has to be unique, which in your case it is not.

You need to modify the code by adding a few lines that increase the record count. This can be done with the help of the RECORD field available in the structure of the end routine.
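For illustration, the record-count fix described here usually looks something like the sketch below. This is a hedged example, not code from this thread: the target structure name `_ty_s_TG_1`, the field names `POSITION`/`JOBROLE`/`JOBSTAGE`, and the lookup logic are all assumptions. The point is that every row appended to `RESULT_PACKAGE` gets its own `RECORD` value, so the new table's technical key (SID, DATAPAKID, RECORD) stays unique.

```abap
* End routine sketch (assumed names): append extra rows per employee
* with a unique RECORD value so the New table key does not collide.

DATA: lt_extra  TYPE STANDARD TABLE OF _ty_s_TG_1,
      ls_result TYPE _ty_s_TG_1,
      lv_record TYPE i.

* Continue numbering after the last record already in the package.
lv_record = lines( result_package ).

LOOP AT RESULT_PACKAGE ASSIGNING <result_fields>.
  " For every additional position / job role / job stage found in the
  " lookup DSOs (lookup logic omitted), append a copy of the row:
  ls_result = <result_fields>.
  " ... fill ls_result-position / -jobrole / -jobstage here ...
  lv_record = lv_record + 1.
  ls_result-record = lv_record.   " unique record number per appended row
  APPEND ls_result TO lt_extra.
ENDLOOP.

* Append outside the loop so the LOOP does not iterate the new rows.
APPEND LINES OF lt_extra TO RESULT_PACKAGE.
```

Collecting the new rows in a separate internal table and appending them after the loop avoids re-processing the appended rows inside the same LOOP.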

Regards,

AL

Former Member

Hi Anshu,

I am trying to append the records. I do not get any error during activation; I only face the error at the time of loading into the new data table. Please see the screenshots above for clarity.

thanks for reply.

anshu_lilhori
Active Contributor

In the first place, why are you changing the key fields of the new data table? You are not supposed to do that.

Just because you have developer authorization does not mean you should change the key fields.

It is pretty evident that when you select other objects as key fields your records become unique; otherwise, based on the three key fields of the new data table (SID, DATAPAKID, RECORD), your first two records are the same. So, as I suggested in my last reply, you need to increase the record count for the extra records you are appending to the table.

This will solve the issue.

Regards,

AL

Former Member

Hi Anshu,

You are right; I only checked the key fields in the new data table for testing purposes.

You are correct that the record ID has to be unique; only then will the data load into the table.

I am now implementing this in the end routine; once it is working fine, I will come back to you.

Can you please share the code for making the records unique, if you have it?

Thanks.

Former Member

Hi Anshu,

The data has been loaded successfully into the DSO in production. I implemented a small piece of code that increases the record count.

Thank you very much.

Answers (3)


Former Member

This kind of issue occurs when we have time-dependent master data. Could you please check the data already available in the data target when you are updating data from the lower layer?

That should solve your problem.

former_member183012
Active Contributor

Hi Dommaraju,

I would suggest you check your DSO key combination once again and compare it with the source data.

Can you please check the keys of the new and active tables for your DSO in SE11:

/bic/Adsoname00

/bic/Adsoname40

I think you need to modify your DSO keys according to your requirement.

regards,

Ganesh Bothe

Former Member

Hi Ganesh,

Actually, only two fields come from the source (employee and calendar month, plus the source system). The error appears when the records are loaded into the new data table, not the active data table. In SE11 I can see the keys checked by default in the active data table, but not in the new data table. Can you please see the above post?

Thanks.


Hi,

In the new table you will not see the DSO key fields checked in SE11. The new table always has SID, data packet ID and RECORD as its key fields.

Only the active table has the same key fields in SE11 as in the DSO.

I think your APPEND coding has some issue. Please share your code so that we can help you better.

Regards,

Poomagal S.

KodandaPani_KV
Active Contributor

Hi Dommaraju,

Please check your DSO key field combination one more time.

The DSO transfers data records based on its key fields.

If you are still facing the issue, please send us screenshots of the error.

Meanwhile, read the document below:

DSO overwrite and summation function.

Thanks,

Phani.

Former Member

Hi KodandaPani,

I have checked the key field combination in the DSO; please see the screenshot below.

Error screenshot

When I check the key fields in SE11 for the new data table, it works fine. See the result screenshot below; after checking the keys, the records appear in the new data table.

anshu_lilhori
Active Contributor

I hope you are not trying to append the records in the code. If so, you need to modify the code.

Also, please share the code for our analysis.

Regards,
AL

former_member185132
Active Contributor

Hi Dommaraju,

In your error message it says "Records filtered in advance as error records with same key already exist".

So what has happened here is this: at some point in the past (not necessarily in the current load) there were some records with the same semantic key that were marked as errors due to some issue, maybe invalid characters or some other problem with the record. The important bit is that because they were marked as error records, any record with the same key arriving in a future DTP request (like the one you are running now) produces this error.

The way to fix it is to open the error DTP and identify the records belonging to the same semantic key. Either delete those records from the error DTP or fix them, then execute the error DTP followed by the regular DTP.


Regards,

Suhas

Former Member

Hi Suhas,

Thanks for your reply.

The error stack was not used before; I enabled it myself to find out which records get stored in it. The records stored in the error stack are correct.