on 2014 May 08 4:20 PM
We have a job with multiple workflows, each containing multiple dataflows. When the job ran, we received an error in one of the dataflows.
However, the job itself only shows a warning. The HDBODBC warning is a known one that we ignore, so our Ops team did not pick up that this job had failed.
The steps after the failed step did not run, and the monitor log shows the job died in the middle of the failed step.
This is how the workflow is set up; it was the first dataflow that failed.
Long story short: is this a known bug that is fixed in subsequent versions (we are on 4.0)? Is there a way to work around it so that the rest of the job can continue?
Thanks,
Ken
Hi Ken,
After seeing your first screenshot, I understand the following:
1 - You are using a HANA database.
2 - There is one field, "WBS_Text", whose length is shorter than the length of the values coming from the source.
3 - You are creating sub-dataflows (using "Run as a separate process" or the Data_Transfer transform).
Suggestions:
1 - Please check that the DSN created for the HANA database is working correctly, since the log says the HDBODBC driver is unknown. A quick way to test the DSN outside Data Services is sketched below.
2 - Please increase the field length for WBS_TEXT.
3 - For the first run, do not create any sub-dataflows.
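If it helps, here is a minimal sketch for testing the DSN outside Data Services. This assumes Python with the pyodbc module is available on the job server machine; the DSN name "MY_HANA", the user, and the password are placeholders you would need to replace:

```python
# Minimal sanity check of the HANA ODBC DSN, run on the Data Services job server.
# Assumes pyodbc is installed; "MY_HANA", DS_USER and the password are placeholders.
import pyodbc

try:
    # Connect through the DSN exactly as Data Services would.
    conn = pyodbc.connect("DSN=MY_HANA;UID=DS_USER;PWD=secret", timeout=10)
    cursor = conn.cursor()
    # DUMMY is HANA's built-in one-row table, so the query works in any schema.
    cursor.execute("SELECT CURRENT_TIMESTAMP FROM DUMMY")
    print("DSN OK:", cursor.fetchone()[0])
    conn.close()
except pyodbc.Error as exc:
    # A "data source name not found" error here means the HDBODBC driver or the
    # DSN is not registered for the user that runs the job server.
    print("DSN check failed:", exc)
```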
Please try these suggestions and update the thread.
Regards,
Shiva Sahu
See my comments...
1 - Can you please check that your DSN created for the HANA database is working fine, as it says HDBODBC is unknown. (KEG - My understanding is that this is a known bug with 4.0 and HANA that will be fixed in subsequent versions. We plan to upgrade in the summer.)
2 - Please increase the field length for WBS_TEXT. (KEG - I believe this is also a known issue: SAP ERP and HANA have an NVARCHAR datatype but Data Services does not. Data Services treats the NVARCHAR field as VARCHAR, and because a Unicode character can take two bytes, the value overflows the declared length and causes the error; see the sketch after this list. This will also be fixed by the upgrade.)
3 - For the first run don't create any sub-dataflow. (KEG - I will look into this, but I have another case with a job containing one workflow and three dataflows. The first dataflow is an RFC call back to an SAP function, the second dataflow processes the data, and the third dataflow is another RFC call. No sub-dataflows. In this case we get an error in the second dataflow, the log shows that step marked "Proceed", yet the job itself is marked green as if it completed successfully. The first and third dataflows show "Stop" in the log.)
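To make the point in (2) concrete, here is a small illustration (plain Python, just to show the length arithmetic; the WBS text value is made up) of why a value that fits the declared NVARCHAR length can overflow once it is treated as a byte-counted VARCHAR:

```python
# Illustration of the NVARCHAR vs. VARCHAR length mismatch described above.
# A 10-character WBS text fits an NVARCHAR(10), but once the field is treated
# as a byte-counted VARCHAR(10) the same value no longer fits.
wbs_text = "Straße-Bau"          # 10 characters, contains the non-ASCII "ß"
declared_length = 10             # e.g. NVARCHAR(10) / VARCHAR(10) on the target

char_count  = len(wbs_text)                       # 10 characters
utf8_bytes  = len(wbs_text.encode("utf-8"))       # 11 bytes ("ß" takes 2)
utf16_bytes = len(wbs_text.encode("utf-16-le"))   # 20 bytes (2 bytes per character)

print(f"characters: {char_count}, UTF-8 bytes: {utf8_bytes}, UTF-16 bytes: {utf16_bytes}")
print("fits as NVARCHAR(10):", char_count <= declared_length)              # True
print("fits as byte-counted VARCHAR(10):", utf8_bytes <= declared_length)  # False
```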
The known issues in 1 and 2 above aside, my concern is more about how Data Services behaves when there are errors: sometimes it runs the remaining dataflows and sometimes it does not, and it marks the job with a warning, or even as successful, when there are errors. I am trying to understand whether that is a setup issue with how the jobs are constructed or a known bug in 4.0.
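For reference, the behaviour I would like from the job is roughly the pattern below. This is only an illustrative Python sketch, not Data Services syntax (as I understand it, the Designer equivalent would be wrapping each dataflow in its own try/catch block), and the dataflow names are placeholders: a failure in one dataflow is recorded and surfaced as a job error, but the remaining dataflows still run.

```python
# Sketch of the desired behaviour: each dataflow runs on its own, a failure is
# recorded rather than downgraded to a warning, and later dataflows still run.
def run_dataflow(name):
    """Placeholder for executing one dataflow; raises on error."""
    print(f"running {name}")
    if name == "DF_Load_WBS":            # simulate the failing dataflow
        raise RuntimeError("WBS_Text value too long for target column")

dataflows = ["DF_Load_WBS", "DF_Process_Data", "DF_RFC_Callback"]
errors = []

for df in dataflows:
    try:
        run_dataflow(df)
    except Exception as exc:
        # Record the failure but keep running the remaining dataflows.
        errors.append((df, str(exc)))

if errors:
    # The job as a whole should end in error, not with a warning or green status.
    for df, msg in errors:
        print(f"ERROR in {df}: {msg}")
    raise SystemExit(1)
print("job completed successfully")
```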