on 2008 Jul 17 8:55 AM
Hi all,
I have questions regarding the use of large files in a BPM scenario. The functional requirements are as follows:
1. Pick up a large raw-data file (1 MB) from an FTP server
2. Drop this file onto a second FTP server
3. After the file has been transmitted successfully (critical!), look into a DB, extract information using the filename of the transmitted file, and extend the message (in a message mapping)
4. Send this data to ECC and update a custom table
My approach to realising this scenario would be to perform the DB lookup in a Java mapping. Are there any other options?
In addition, I am concerned about performance, because we will send about 200 files a day (up to 10 at a time) through this interface. Is there a possibility to avoid the integration process?
Kind regards and thanks in advance
Florian
Hi Florian,
Your requirement can be accomplished with or without BPM.
With BPM:
To improve performance:
You can use the concept of Message Packaging for BPE, which is well suited to requirements like yours, i.e. where multiple files come in as a bunch.
You can also define the receive step for the file on FTP inside a block only, since you are not using the file content and need only the file name.
Please use a JDBC lookup only; it will improve performance to a great extent.
After the JDBC lookup you can use that data to write to ECC.
Without BPM:
You can use adapter modules to extract the file name, pass it to a JDBC lookup, and then send the output of the JDBC lookup to ECC.
Even here you can use the concept of Message Packaging.
To handle exceptions you can use alerts.
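Sunil's JDBC-lookup suggestion can be sketched in plain Java, outside any SAP API. This is a minimal sketch under assumptions: the table and column names (`file_metadata`, `meta_col1`, `meta_col2`) are hypothetical placeholders for your real schema, and in an actual Java mapping the `Connection` would come from a configured data source or lookup channel rather than being constructed here:

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class FileMetadataLookup {

    // Hypothetical table and column names -- replace with your real schema.
    static final String LOOKUP_SQL =
        "SELECT meta_col1, meta_col2 FROM file_metadata WHERE filename = ?";

    /** Strips any directory part so the DB key is the plain file name. */
    public static String toLookupKey(String path) {
        int slash = Math.max(path.lastIndexOf('/'), path.lastIndexOf('\\'));
        return path.substring(slash + 1);
    }

    /**
     * Runs the parameterized lookup. In a real mapping the Connection
     * would be obtained from the lookup API / data source configured in PI.
     */
    public static String[] lookup(Connection con, String filename) throws SQLException {
        try (PreparedStatement ps = con.prepareStatement(LOOKUP_SQL)) {
            ps.setString(1, toLookupKey(filename));
            try (ResultSet rs = ps.executeQuery()) {
                if (!rs.next()) {
                    throw new SQLException("No metadata found for " + filename);
                }
                return new String[] { rs.getString(1), rs.getString(2) };
            }
        }
    }
}
```

Using a `PreparedStatement` with a bound parameter (rather than concatenating the filename into the SQL) keeps the lookup safe and lets the database reuse the statement plan across the ~200 calls per day.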
Reward Points if Helpful
Thanks
Sunil Singh
Break it into two interfaces:
File to File and JDBC to R3
Thanks
Farooq.
Thanks for your replies:
@Farooq
When I break it into two interfaces, how can I make sure that the DB interface is only executed after the file has been successfully transmitted?
Regarding the database:
The database contains additional information about that file. The file itself is just a binary, and SAP needs the metadata for further processing. So what I want to do is transmit the file and, after that has finished, look into the DB (the key for the record I want to retrieve is the filename), pick up the metadata, and send it to SAP. So in the end my requirement is:
Once the record has been transmitted successfully, I want to be sure that the file with the raw data has been successfully stored on the other server.
My approach to realise this scenario would be to perform the DB lookup in a Java mapping. Are there any other options?
You may also use a sync send step to fetch the details from the database. This step should come after the async send step to the file receiver.
Is there a possibility to avoid the integration process?
If you want to handle all the specified things in a single scenario, you cannot avoid BPM. However, it could be helpful to break the scenario into two parts: first perform the file-to-file transfer asynchronously, then use the newly arrived file to trigger a second async scenario, in which you extract the file name dynamically and proceed further.
Regards,
Prateek