
PSA error records update -- long time

Former Member

Hello BW Experts,

I am loading master data for a custom InfoObject, IOB1.

It has around 150 fields.

The InfoPackage's error tolerance is set to 1.

I have loaded 1.5 million records.

One record has bad data.

In the monitor > Details, only one package is red; the others are green.

I corrected the record in the PSA

and pressed the 'further update' button > pop-up: incorrect records are being updated.

But it took almost the same time (7 hrs) as the initial load of 1.5 million records.

1) Do you know the reason it is taking such a long time?

2) What process takes place when the error records are updated for master data from the PSA?

3) Any suggestions to cut down this time?

Suggestions appreciated.

Thanks

BWer

Accepted Solutions (1)


edwin_harpino
Active Contributor

hi BWer,

try transaction RSCUSTV6 and decrease/increase the package and partition sizes there

to improve the performance.

For error handling, you can try the 'error handling' setting in the InfoPackage.

hope this helps.

Former Member

AHP,

the package size is set to 50,000

and the partition size is set to 1,000,000.

Is that okay? Which one should I decrease?

Regards,

BWer

edwin_harpino
Active Contributor

hi BWer,

seems ok.

Number of Data Records per Package

This option refers to the number of data records that are delivered with every upload from a flat file within a packet. The basic setting should be between 5000 and 20000 depending on how many data records you want to load.

Activities

If you want to upload a large quantity of transaction data, change the 'number of data records per packet' from the default value of 1000 to between 10000 (Informix) and 50000 (Oracle, MS SQL Server).

Dependencies

The data packets can be imported in parallel in BW in the background. With an improved use of system resources (uploading is divided up into several work processes), uploading in parallel is the most cost-effective from a performance point of view.

Recommendation

You should not divide up the quantity of data into too large a number of packets since this reduces performance when uploading. The number of data packets should not be more than 100 per loading process.

Size of PSA partition

You can determine here after how many records a new partition should be created. By default, this value is set to 1,000,000 records.
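Just to sanity-check the settings above against the load discussed in this thread (a rough back-of-the-envelope sketch, not a BW transaction — the record count and sizes are the ones mentioned in this thread):

```python
import math

total_records = 1_500_000   # size of the load from the original question
packet_size = 50_000        # InfoPackage packet size the poster mentions
partition_size = 1_000_000  # default PSA partition size

# Number of data packets per load, and PSA partitions for the request
packets = math.ceil(total_records / packet_size)
partitions = math.ceil(total_records / partition_size)

print(packets, partitions)  # 30 2
```

So with these settings the load produces 30 packets, comfortably under the recommended cap of 100 packets per loading process, and spans 2 PSA partitions.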

hope this helps.

Answers (2)


Former Member

Hello BWer,

You can follow what Rohit is saying, but since you did not enable the error handling option initially, you cannot apply it now - please correct me if I am wrong. So what you could do is load the data from the PSA once you correct the record. You can follow these steps:

1. In the data target, turn the request red and delete it.

2. Go to the PSA and correct the record(s) that have a red status. The edited PSA should have all records with either green (correct) or yellow (edited manually) status. SAVE the PSA. (If you did not delete the package from the data target, you won't be able to edit the PSA.)

3. Once the PSA is correct, turn the request green. The request will continue running as if there had been no errors.

4. The request will turn itself yellow now. To monitor the reload, go to the tab "Details" and open the step "Processing (data packet)". The request will be green in the steps Update PSA and Transfer Rules because the data is already in the PSA. The step that is missing and will be processed is the one that takes the data from the PSA through the Update Rules into the data target.

5. When the request finishes, the detailed log will show green.

Hope this helps you!

Liza.

Former Member

Liza,

If I have to load the errored records, there are two options, right?

1) The 'further processing' button

2) Turn the request green (as you suggested). I have never tried this option, but I am wondering: by turning it green, will the process start automatically?

Thanks,

BWer

Former Member

Hi BWer,

You should turn the request green, then go to the PSA, right-click, and choose 'update data targets'. This will start the request running again.

Bye,

Liza.

Former Member

Hi BWer,

If you have set error tolerance in the InfoPackage, then on error the system should create a new request with that record, and you can use that request. It will load fast, as it will have only one record in it.

Regards,

Rohit

Former Member

Rohit,

Our scenario is that we have a couple of transaction loads which do a lookup on the master data, so we usually keep the error tolerance at 1 so that any error comes to the attention of the developer/BW person. Our load therefore fails after one error in each package, so I correct the records in the PSA and load them again. This load takes a long time. How do you handle such a scenario?

What are your tolerance settings, and how do you handle very sensitive/important master data daily loads?

Thanks,

BWer

Former Member

Hi BWer,

In the error handling options, if you select the second or third option (i.e., 'Valid Records Update...'), the system puts all the error records in a separate request, which you can trigger after rectifying the errors.

Do you have any requirement that records must be loaded in the sequence they were received? If not, you can try this.

Regards,

Rohit