
BODS - On Error Retrieve Record Values

Former Member

Hello,

We are running BusinessObjects Data Services 4.1. I'm creating a new job in Designer and I'm having trouble capturing record information on an error using the Catch object. The job loops through a source table in database A, performs a few simple transforms on some of the records, and then inserts/updates/deletes the corresponding record in a destination table in database B. If the job encounters an error with a record from database A (for example, because it duplicates a record already in the table in database B), we would like to mark the record that caused the error as failed by entering an F in its status column. I had hoped to do this in the Catch object by capturing the record that caused the error and running an update statement against the table based on the ID column. However, I don't know how to retrieve the record that caused the error within the Catch. In a script placed inside the Catch I have access to these functions:

Error_Number()

Error_Context()

Error_Message()

Error_Timestamp()

But none of these will really give me details on the specific record that caused the error. Is there a way to look at the record that caused the error and grab column values off of it?

For example: we are looping through a table with 10 records, IDs 1 through 10. Record 5 (ID 5) causes an error and the Catch object catches it. In the Catch I'd like to find out which record caused the error so that I can grab its ID. Then I can update the status column for record ID 5 to 'F' and also send an email indicating that the record with ID 5 caused a failure in the job.
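One workaround (assuming the job really does drive the processing from a workflow loop, e.g. a while loop handling one record per pass) is to store the key of the record currently being processed in a global variable before each pass; the Catch script can then use it. A minimal sketch in Data Services script, where the global variable $G_Current_ID, the datastore DS_SourceA, and the table/column names are all hypothetical placeholders:

```
# Hypothetical Catch script. Assumes the workflow loop sets $G_Current_ID
# to the key of the record being processed before each pass.
print('Error [error_number()] on record [$G_Current_ID]: [error_message()]');

# Mark the current record as failed in the source table in database A.
sql('DS_SourceA', 'UPDATE SRC_TABLE SET STATUS = \'F\' WHERE ID = [$G_Current_ID]');

# Notify by e-mail (address and wording are placeholders).
smtp_to('etl.support@example.com',
        'Job failed on record [$G_Current_ID]',
        'Error: [error_message()]', 0, 0);
```

This only works if the loop granularity is one record per pass; as noted further down in the thread, a dataflow itself processes records as a set, so the Catch alone cannot tell you which row failed.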

Thanks for any help that you can provide,

Riley

Accepted Solutions (1)

former_member187605
Active Contributor

Your requirements are not 100% clear to me. Can I summarize them as follows?

  1. You want to replicate modifications from a table in database A to database B
  2. You want to flag all occurrences of a record in A that already match a record in B

For #2, you can use a simple inner join. Include all the columns in the join condition. The target table will contain the full duplicates only.
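In a dataflow this join would be set up in a Query transform. A sketch of the join condition Dirk describes, with every column included (table and column names are placeholders, not from the thread):

```
# Hypothetical Query transform join condition between source A and target B.
# Joining on ALL columns means only rows that are identical in both tables
# (i.e. full duplicates) pass through to the output.
SRC_A.ID = TGT_B.ID
AND SRC_A.COL1 = TGT_B.COL1
AND SRC_A.COL2 = TGT_B.COL2
```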

For #1, i.e. to sync the two tables, use the Table_Comparison transform, which will automatically generate the correct insert, update, and delete commands for the target table.

Note: duplicate records (identified in the previous flow) are ignored by Table_Comparison.

Former Member

Hi Dirk,

The duplicates were just one example of a possible error that we'd like to catch so we can mark the bad record in the source as failed. Other issues with a record could cause a failure (a data type mismatch, data truncation, perhaps a missing before or after record on an insert). When a case like that happens, from the Catch we just wanted to figure out which record caused the error to be raised and then mark it as failed, so that subsequent runs will skip it and we can investigate the cause.

Your suggestion of a Table_Comparison may help with the duplicates (but I'm not sure about the other possible errors?). I will investigate that solution to see if it will work for us.

Thanks,

Riley

former_member187605
Active Contributor

It's obvious that your dataflow logic will depend on your requirements. In most cases your best bet would be to include a Validation transform in your dataflow to remove invalid records from further processing.

But note that DS is not a procedural engine. Conceptually, it deals with data in complete sets at a time. It does not "loop" through a table, you cannot break out of a dataflow, and you cannot carry values from one record over to the following ones... The only exception is the built-in function previous_row_value, which returns the column value of the previous record (that function, by the way, may have a significantly negative impact on the performance of your process).
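For completeness, previous_row_value is used in a Query transform's output column mapping. A hypothetical mapping (column names are placeholders) that flags a change from the prior row might look like:

```
# Hypothetical Query transform column mapping: compare each row's AMOUNT
# with the previous row's value. The sort order must be fixed upstream
# (e.g. an ORDER BY in a preceding Query) for the result to be meaningful.
ifthenelse(Query.AMOUNT = previous_row_value(Query.AMOUNT), 'SAME', 'CHANGED')
```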

Former Member

Hi Dirk, Thanks for your response. You've helped me understand how the system works a little better. We may need to re-think our dataflow.

Riley

Answers (1)

Former Member

Hi,

Please use the Validation transform.

Regards,

Manoj

Former Member

We have a Validation transform, but it does not catch duplicate key errors, for example. Is there a way for the Validation transform to read the destination database and determine whether a record to be inserted would be a duplicate?
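One way to let the Validation transform "see" the target (this is an assumption on my part, not something stated in the thread) is to look the key up in database B with lookup_ext() in a preceding Query transform, then validate on the result. All datastore, table, and column names below are hypothetical:

```
# Hypothetical Query transform: add a column TGT_ID that holds the matching
# key from the target table in database B, or NULL if no match exists.
lookup_ext([DS_TargetB.DBO.TGT_TABLE, 'PRE_LOAD_CACHE', 'MAX'],
           [ID], [NULL], [ID, '=', Query.ID])

# Validation transform rule on the new column: a record passes only if it
# is not already present in the target (i.e. it would not be a duplicate).
Query_Lookup.TGT_ID IS NULL
```

PRE_LOAD_CACHE loads the target table into memory once, which keeps the per-row lookup cheap; the exact argument list of lookup_ext is easiest to build with the function wizard in Designer.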

Former Member

Hi,

Yes, I think so. Please explore it.

Regards,

Manoj

Former Member

Hi,

You could capture the error records in an overflow file: enable the overflow file in the target table's options.

To capture the duplicate records, use gen_row_num_by_group() and keep the rows where the generated number is greater than one.

Arun
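Arun's duplicate check could be set up in two Query transforms, sketched below (column names are placeholders; the input must be sorted by the grouping key so that rows with the same key are adjacent):

```
# First Query transform: add a sequence number that restarts for each
# business key. Mapping for a hypothetical new column DUP_SEQ:
gen_row_num_by_group(Query.ID)

# Second Query transform: WHERE clause that keeps only the duplicates
# (every occurrence of a key after the first one).
Query_Seq.DUP_SEQ > 1
```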