on 2015 Jun 18 3:37 PM
Hello,
We have a workflow that starts whenever a material changes. The workflow then finds all active POs that involve this material and, using the dynamic parallel processing feature, submits a subworkflow for each PO to update it with the material changes.
The issue I'm having is that once the number of POs reaches 1,000, the workflow goes into error, saying that it can only handle 999 parallel instances. Is this a "set in stone" limit? What can I do to process more than 999 instances?
Thanks!
Dan Stoicof
Thank you all for your responses. I understand your position, to redesign the workflow.
Let me give you my take on this: this workflow has no approvers; it's just meant to automate the updating of POs (and PRs) when the material changes. I see no harm starting a dynamic parallel process for as many items as there are found. The reason is that the volume is managed by the Event Queue. This has been working just fine, with no issues whatsoever. That was until the workflow stopped because it reached the "limit" of 999 items (probably there is a three-digit field defined somewhere).
I welcome your comments/suggestions.
Regards,
Dan Stoicof
> I see no harm starting a dynamic parallel process for as many items as there are found.
Even after you found out that it doesn't work?
>The reason is that the volume is managed by the Event Queue.
How? Your workflow gets started by an event, right? That part is managed by the event queue. But once the workflow has started, I don't see how the event queue would be managing anything?
I would definitely not use parallel processing in your scenario. Normally there should be a reason for parallel processing (something genuinely needs to be done in parallel; for example, an invoice should be approved simultaneously by several agents). Why should the POs get updated in parallel? Why can't it be done sequentially (which actually happens already, even when you are using parallel processing)?
I am not actually sure I would use workflow at all, or at least I would not create individual work items for each and every PO/PR. It seems like unnecessary overhead to me to use workflow for this kind of mass-update scenario.
Kind regards,
Karri
Karri,
To answer your question on how the Event Queue is used in this case: the main workflow is started by a "material changed" event. This starts a process that creates lists of all the affected POs and PRs. Using these two lists, we start a subworkflow for each PO and/or PR found. These subworkflows are registered with the Event Queue, which releases (currently) 50 workflows every 2 minutes. This also allows for monitoring and easy reprocessing in case of errors. Who says that dynamic parallel processing is mainly meant for approvals? I can use it to speed up a process, which was my goal in this case. Imagine doing 20 updates in parallel rather than sequentially, then multiply by several hundred.
I welcome your suggestions, but I don't see how your solution (processing each item sequentially, outside the workflow) would be any better: how would you know which processes failed, or whether work processes are available before releasing the next update, etc.? We did not have a single error until recently (while the volume was below the limit of 999), and I can certainly modify the workflow logic/process to break down the list into manageable chunks.
Again, I thank you for your suggestions.
Regards,
Dan
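The "manageable chunks" workaround Dan describes can be sketched independently of SAP. Below is a minimal Python illustration of the splitting logic only; the chunk size of 999 comes from the limit discussed in this thread, and `chunked` is a hypothetical helper, not part of any SAP or workflow API:

```python
def chunked(items, size=999):
    """Yield consecutive slices of `items`, each at most `size` long,
    so that no single dynamic parallel step ever receives more than
    999 entries (the engine limit discussed in this thread)."""
    for start in range(0, len(items), size):
        yield items[start:start + size]

# Example: 2,500 affected POs become three chunks of 999, 999, and 502.
pos = ["PO%05d" % n for n in range(2500)]
chunks = list(chunked(pos))
sizes = [len(c) for c in chunks]
```

Each chunk would then drive one parallel step (or one subworkflow launch) at a time, keeping every step under the engine's instance limit while still processing the full list.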
Hi Dan,
your problem is not one that should be solved by a workflow at all.
Do your updates directly in the receiver-type function module instead: replace SWW_WI_CREATE_VIA_EVENT_IBF with a copy of it that holds your own code to update the PO directly. No workflow.
Best wishes
Florin
Hello
If you are concerned about error handling during update failures, you can create a custom application log to record errors and send an email (with the application log details attached) to a distribution list for action. You can also include an email step/decision step for error handling in the workflow.
> I can certainly modify the workflow logic/process to break down the list into manageable chunks.
I think it is a good idea to package the POs and PRs related to a material into groups for processing.
Regards
Sandy
Hi Dan,
>Who says that Dynamic Parallel processing is to be mainly used for approvals?
Well, I didn't. It was just an example. I have also used dynamic parallel processing for background steps/processes, but only for a very limited number of parallel cases.
Basically I see only two ways out of this problem:
1) The "manageable chunks" option. (Now I understand your usage of the event queue, and in a certain way it makes sense.)
2) Redesigning your solution. I certainly would not use workflow, or at least not in the way you are using it. You will keep running into these internal limitations of the WF engine; from my point of view, WF was not designed to be a mass processor of things, at least not when it involves creating lots of work items under one parent work item. Sandy gave you some great hints about the error handling part, if that is one of your concerns.
One last question: is it possible that, for example, hundreds of materials are changed at the same time (by some background job, interface, or whatever), and each of these changes then affects lots (hundreds) of POs/PRs? If that is the scenario, then this really gets interesting, and there is a great need for careful control of things...
Kind regards,
Karri