2010 Sep 14 6:52 AM
Hello everyone,
I am processing a file from the application server and posting material documents in SAP against a unique number from the legacy system. This runs as a background job, but the same program can also be executed in foreground.
My problem is reprocessing the same input file, or accidentally processing the same file again and again.
Option 1:
While posting the material document, I pass the legacy invoice number in MKPF-BKTXT using the same BAPI. Before each posting, I check BKTXT for that invoice number.
Option 2:
Maintain a separate table of posted invoices.
Option 3:
Maintain a log file on the application server for each input file and check it before posting.
Option 1 seems best to me: with the other two options there is a chance that the document is posted but the program terminates while writing the log, unless I can combine the posting and the log write in the same logical unit of work (LUW).
Please share your opinion...
Is there a better way to handle this?
Thanks
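To make option 1 concrete, here is a rough ABAP sketch of the duplicate check, assuming the legacy invoice number is stored verbatim in MKPF-BKTXT (the variable names and the use of BAPI_GOODSMVT_CREATE are illustrative, not taken from the actual program):

```abap
* Option 1 sketch: before posting, look for an existing material
* document header that already carries this legacy invoice number.
DATA: lv_invoice TYPE mkpf-bktxt,   " legacy invoice number from the file
      lv_mblnr   TYPE mkpf-mblnr.

SELECT SINGLE mblnr
  FROM mkpf
  INTO lv_mblnr
  WHERE bktxt = lv_invoice.

IF sy-subrc = 0.
  " Already posted for this legacy invoice - skip this record
  WRITE: / 'Invoice', lv_invoice,
           'already posted as material document', lv_mblnr.
ELSE.
  " Safe to post: call the goods movement BAPI with
  " GOODSMVT_HEADER-HEADER_TXT = lv_invoice, then COMMIT WORK.
ENDIF.
```

Note that without an index on BKTXT this SELECT scans MKPF, which is what the performance comments below this post are about.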
2010 Sep 15 7:15 AM
Hi Prashant \ Rock,
Thanks. I am going with option 1 and planning to create a secondary index on the BKTXT field. Are there any negatives to creating an index on a text field?
@ Bruce
In my case the user is allowed to process the same file any number of times; the only constraint is that once documents are posted for a legacy invoice, they must not be posted again. The input file will be corrected and reprocessed until all postings are complete.
Thanks
2010 Sep 15 7:21 AM
Hi R V,
Well, no, there is no real downside to declaring a secondary index. It's just that there shouldn't be too many secondary indexes on that table; if I remember right, up to 5 is bearable.
2010 Sep 14 8:11 AM
Hi,
For me, option 1 is also the best. For the second option you have to create a Z table and store the data, which I think is unnecessary, and similarly for the third option you have to create a log file on the application server.
Possible performance problem with option 1: as long as you execute the program in background there is no issue, but when you execute it in foreground and the data volume is high, that SELECT can cause performance problems.
Thanks & Regards,
Rock.
2010 Sep 14 7:13 PM
R V,
We use a 4th option.
We created a database table with the following fields:
CHAR 15 Interface File Name, case sensitive
DATS 8 Date Assigned by Source System
TIMS 6 Time Assigned by Source System
DATS 8 Date added to this table
TIMS 6 Time added to this table
All interface files contain a header record with these three fields:
CHAR 15 Interface File Name, case sensitive
DATS 8 Date Assigned by Source System
TIMS 6 Time Assigned by Source System
We created a program that compares the input file's header record (name, date, and time) with the values in the database table. A duplicate triggers an email and stops the next program in the stream (the program that creates the SAP transactions). If the input file is not a duplicate, a record is inserted into the database table and the next program in the job stream runs.
Bruce
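Bruce's approach could be sketched in ABAP roughly as follows. The table name ZFILE_CTRL and its field names are invented for illustration; the actual control table and job-stream mechanics would differ:

```abap
* Sketch of the duplicate-file check against a custom control table
* (ZFILE_CTRL and all field names below are hypothetical).
DATA: ls_ctrl   TYPE zfile_ctrl,          " record built from the file header
      lv_fname  TYPE zfile_ctrl-fname.

" Compare the file header (name, source date, source time) with the table.
SELECT SINGLE fname
  FROM zfile_ctrl
  INTO lv_fname
  WHERE fname    = ls_ctrl-fname
    AND src_date = ls_ctrl-src_date
    AND src_time = ls_ctrl-src_time.

IF sy-subrc = 0.
  " Duplicate file: send the notification email and stop the
  " downstream program that creates the SAP transactions.
ELSE.
  " New file: record it, then let the next program in the job stream run.
  ls_ctrl-add_date = sy-datum.
  ls_ctrl-add_time = sy-uzeit.
  INSERT zfile_ctrl FROM ls_ctrl.
  COMMIT WORK.
ENDIF.
```

This keeps the duplicate check at file level, which complements rather than replaces a per-invoice check like option 1.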