on 2016 Mar 15 11:32 AM
Hi
Can someone please help with my understanding of what exactly happens during a data upload?
We have a scheduled package that uploads data from a CSV file via the Data Manager to the FACT tables regularly during the day.
The package contains, amongst other things, a BPC Convert task followed by a Load and Process task. Immediately after these, a logic script is called which copies selected data to another BPC model. Finally, there is a lite optimize of the affected models.
My question is what is the data source for the logic script at the moment it is run?
Is the data at that moment still in real-time/write-back (WB) storage, or has it already arrived in the FAC2 tables?
The reason I am asking is that we want to introduce a second scheduled load, also with scripts, which may coincidentally run at the same time as the first. I want to avoid the risk that either script inadvertently picks up data belonging to the other package for processing.
Trust I have explained this sufficiently.
Thanks in advance
Alan
BPC 10 / SQL Server 11
Hi Alan
Based on the fact that you are running lite optimizations, your package appears to be loading data into the WB table, and the processing occurs against the WB table.
The data arrives in the FAC2 table after performing the lite optimization.
I would suggest running a SQL Profiler trace and capturing all the SQL statements while the package runs. You can run it with and without the script logic so you can see the difference.
That way you will know exactly how your data is being loaded.
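As a lightweight alternative to Profiler, an Extended Events session (available on SQL Server 2012, i.e. SQL Server 11) can capture the same statements. A minimal sketch follows; the AppSet database name and the file path are placeholders you would replace with your own:

```sql
-- Hypothetical Extended Events session capturing batches against the BPC
-- AppSet database; 'YourAppSet' and the file path are placeholders.
CREATE EVENT SESSION [BPC_PackageTrace] ON SERVER
ADD EVENT sqlserver.sql_batch_completed (
    ACTION (sqlserver.sql_text, sqlserver.session_id)
    WHERE (sqlserver.database_name = N'YourAppSet'))
ADD TARGET package0.event_file (SET filename = N'C:\Temp\BPC_PackageTrace.xel');

ALTER EVENT SESSION [BPC_PackageTrace] ON SERVER STATE = START;
-- ... run the package once with and once without the script logic ...
ALTER EVENT SESSION [BPC_PackageTrace] ON SERVER STATE = STOP;
```

Reading the .xel file afterwards (e.g. with sys.fn_xe_file_target_read_file) shows which tables (WB vs FAC2) each step of the package actually touches.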
If you need to make sure that the logic is not run twice over the same data by two successive packages, I would suggest implementing some control fields/tables and using them to ensure that a package runs only after the previous one has finished.
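One way to sketch such a control mechanism, assuming you can add a custom T-SQL step around the package (the lock name and timeout below are illustrative, not standard BPC objects), is to use SQL Server's application locks instead of a hand-rolled status table:

```sql
-- Hypothetical serialization wrapper for a package run; the resource name
-- 'BPC_Finance_PackageRun' is a placeholder chosen per model.
BEGIN TRAN;

DECLARE @rc INT;
EXEC @rc = sp_getapplock
     @Resource    = N'BPC_Finance_PackageRun',  -- one lock name per model
     @LockMode    = N'Exclusive',
     @LockOwner   = N'Transaction',
     @LockTimeout = 600000;                     -- wait up to 10 minutes

IF @rc < 0
BEGIN
    ROLLBACK TRAN;
    RAISERROR(N'Another package is still running - aborting.', 16, 1);
    RETURN;
END

-- ... run the package / script logic work here ...

COMMIT TRAN;  -- committing (or the connection closing) releases the lock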
Stefan
Thanks Stefan, this helps. However, are you able to confirm whether having two independent scripts running simultaneously in the same model, across shared data in the WB table, could present a problem?
I'm thinking particularly of table or record locking, with the risk of one or more packages failing (e.g. being unable to write to the corresponding sgData table). I seem to remember we have had issues when a scheduled package was running and a user was manually processing a script at the same moment.
Alan