
Handle duplicate records in DTP.

Former Member

Hi,

we are encountering issues with duplicates while loading data from a DSO into Material (20 attributes; Material is the key). Whenever an attribute changes, the record comes through in the delta load.

We have enabled the "handle duplicate record keys" check, but the load still fails due to duplicate record keys.

Settings in the DTP: no semantic groups enabled.

Error handling: valid records update, reporting possible; handle duplicate records enabled.

Processing mode: serially in the background.

Help is appreciated.

Praveen.

Accepted Solutions (0)

Answers (4)

Former Member

This message was moderated.

former_member182470
Active Contributor

Hi Praveen,

Basically, the "Semantic Group" defines the key fields used for building data packages and for the error stack, so that records with the same key can be handled together.

Regards,

Suman

Former Member

Hi,

You will have to enable a semantic group. In the semantic group, select Material as the key.

The "handle duplicate record keys" setting only handles duplicates within the same data package; if the same material is extracted in another package, the load still throws a duplicate record error. Enabling semantic keys ensures that packaging is done based on those keys, i.e. all records for a given material are pulled into a single package rather than distributed across multiple packages.

All records for the same material will then be extracted in a single package, and the duplicate-record check will filter them.

regards,

Arvind.
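The packaging behaviour Arvind describes can be illustrated with a small sketch. This is plain Python, not SAP code; the function names and the toy record layout are invented for illustration only:

```python
# Toy illustration (not SAP code): why the in-package duplicate check
# alone is not enough when the same key is split across packages,
# and how grouping by a semantic key fixes it.

from itertools import groupby

def build_packages(records, package_size, semantic_key=None):
    """Split records into packages. With a semantic key, records that
    share a key are grouped first so they always land in one package."""
    if semantic_key:
        records = sorted(records, key=semantic_key)
        packages, current = [], []
        for _, group in groupby(records, key=semantic_key):
            group = list(group)
            if current and len(current) + len(group) > package_size:
                packages.append(current)
                current = []
            current.extend(group)
        if current:
            packages.append(current)
        return packages
    # No semantic group: a plain size-based split can separate duplicates.
    return [records[i:i + package_size]
            for i in range(0, len(records), package_size)]

def dedupe_within_package(package, key):
    """Mimics 'handle duplicate record keys': keeps the last record per
    key, but only ever sees duplicates inside one package."""
    return list({key(r): r for r in package}.values())

records = [("MAT1", "v1"), ("MAT2", "v1"), ("MAT1", "v2"), ("MAT3", "v1")]
key = lambda r: r[0]

# Without semantic grouping, MAT1's two versions land in different
# packages, so the per-package duplicate check cannot catch them.
plain = build_packages(records, package_size=2)
grouped = build_packages(records, package_size=2, semantic_key=key)
deduped = [dedupe_within_package(p, key) for p in grouped]
```

With grouping, both MAT1 records end up in the first package and the duplicate check collapses them to the latest version; without it, each package sees only one MAT1 record and reports a cross-package duplicate instead.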

Former Member

Thanks Arvind,

Do I need to create a new DTP to enable semantic groups, or can I change the existing DTP (which has already sent 4-5 requests to the target)?

Praveen

Former Member

Hi,

You can change the existing one.

regards,

Arvind.

Former Member

Thanks... do I need to use "valid records update and reporting possible" for this?

Former Member

Hi,

You can, if you are expecting other data-quality issues for other records (such as lowercase characters, spaces, invalid characters, etc.).

regards,

Arvind.

Former Member

Hi Praveen,

As Arvind correctly pointed out, we use semantic groups to specify how the data packages read from the source (DataSource or InfoProvider) are built. To do this, we define key fields; data records that have the same key are combined into a single data package. We specify that key in the semantic keys.

"Do I need to use valid records update and reporting possible for this?"

Choosing that option is independent of the duplicate-record issue.

The DTP supports handling erroneous data records. At runtime, incorrect records are sorted out and can be written to an error stack. After the error has been resolved, you can update the data from the error stack to the target. Errors may arise from invalid characters posted into ECC-side tables, or from characters not permitted in RSKC. So rather than going back to the PSA, editing the data there, and reloading it, we have the error stack, where these records can be corrected.

The options below give the basic idea of the error stack and how erroneous records are handled:

No Update, No Reporting: if an error occurs, the whole data package is terminated.

Valid Records Updated, No Reporting (Request Red): valid records are updated; data becomes available for reporting only after the request is manually released.

Valid Records Updated, Reporting Possible (Request Green): valid records are updated and immediately available for reporting.
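The three options above can be sketched as follows. This is an illustrative Python toy, not an SAP API; the function name, mode names, and record format are all invented:

```python
# Toy sketch (invented names, not SAP code) of the three DTP
# error-handling options described above.

def run_dtp(package, is_valid, mode):
    """Returns (loaded_records, error_stack, request_status)."""
    valid = [r for r in package if is_valid(r)]
    errors = [r for r in package if not is_valid(r)]
    if mode == "no_update_no_reporting":
        if errors:  # one bad record terminates the whole package
            return [], package, "red"
        return valid, [], "green"
    if mode == "valid_no_reporting":
        # valid records load; request stays red until manually released
        return valid, errors, "red" if errors else "green"
    if mode == "valid_reporting":
        # valid records load and are immediately reportable
        return valid, errors, "green"
    raise ValueError(mode)

package = ["MAT1", "MAT#2", "MAT3"]   # '#' stands in for an invalid character
ok = lambda r: "#" not in r

loaded, stack, status = run_dtp(package, ok, "valid_reporting")
```

In the last mode the bad record goes to the error stack while the two valid ones load with a green request, which matches the "reporting possible" behaviour described above.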

Regards

Raj Rai

Former Member

Hi,

This thread may help:

[;

Thanks