Dear Experts:
As part of our new project we have to make both online and offline OData calls to the backend passing deep entities (a Header and its Position lines); basically, in the backend OData Header entity set's Create method I will need to access and process all Position lines associated with that Header.
From what I have investigated and understood (https://answers.sap.com/questions/13058754/mdk-odata-offline-create-entity-for-entity-with-na.htmL), there are limitations on using Deep Insert for offline operations, and the options would be either a $batch call (which I am not sure will solve my problem) or the Create Related Entity action as described in the tutorial https://developers.sap.com/tutorials/cp-mobile-dev-kit-link-entity.html.
I have been doing some tests with my Trial account (our production platform will hopefully be activated this week), and my questions are the following:
As another backup workaround, I am also considering first saving/syncing the Position lines table in a separate Offline Service upload action, and then, once they are saved in the backend, syncing the Header table via another Offline Service upload action, so that the Header's backend processing can access its already-saved Position lines.
In any case, I would really appreciate it if anyone could share their experience, examples or ideas for an optimal solution to this case.
Many thanks,
Yergali
While working with MDK, we use $BATCH to make sure deep insert is possible in the offline scenario. The only issue we have there is the workflow. What we can do, with nearly no effort, is this: we create an order with some information for Material and Operations, and all these items should be created in a single step. With $BATCH we can simply loop over all entries and make sure the items are filled correctly. As long as we have not created any other object, we can make updates to these items without issues.
It does get problematic if we have to create/update another object and only afterwards create another item that should be added to the above $batch. This is an issue because it results in a new call. But since this would be the same issue in direct backend calls, I do not see it as a problem as such.
So in my understanding $batch would be the way to sort that issue.
Oliver,
Thank you very much for your comments!
I have read through some online resources about batch processing in MDK Offline, including this post: https://answers.sap.com/questions/13357105/correct-changeset-handling-in-mdk-app.html. My understanding is that, in a nutshell, the logic you described would be the following (a rough frontend sketch follows the list):
- In the frontend MDK app, you group all your Create/Update actions within a single ChangeSet group of actions.
- When this ChangeSet is executed against the Offline store, all the Header and Item create actions end up in the same batch.
- When you sync your Offline store, that same batch is sent to the backend.
- In the backend, in the CHANGESET_BEGIN, CHANGESET_PROCESS and CHANGESET_END methods, you receive the whole batch together as an internal table and process it as needed.
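To make the frontend part a bit more concrete, here is a minimal sketch. It assumes (based on the changeset thread linked above, not on anything confirmed in this post) that an MDK ChangeSet-type action is defined whose Actions list contains the Header CreateEntity action followed by the Item CreateEntity actions; all paths and names below are placeholders.

// Sketch only: '/MyApp/Actions/Orders/OrderChangeSet.action' is assumed to be a
// ChangeSet action listing the Header create followed by the Item creates, so
// the offline store records them as one changeset and the next upload sends
// them to the backend in a single $batch.
export default function CreateOrderDeep(clientAPI) {
    return clientAPI.executeAction('/MyApp/Actions/Orders/OrderChangeSet.action')
        .catch((error) => {
            // Placeholder error-message action
            return clientAPI.executeAction({
                'Name': '/MyApp/Actions/Messages/GetMessage.action',
                'Properties': { 'Message': error }
            });
        });
}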
Could you please confirm whether I understood this correctly? Do you have any example of the frontend and backend code you implemented for the steps above, mainly the part related to batch processing?
Many thanks,
Yergali
I personally have worked around this issue the same way I did before MDK (in HAT-based offline Fiori apps). My first MDK project was actually an adaptation of one of our previous HAT applications, so, since we were reusing most of the backend logic, we transitioned smoothly by using basically the same technique.
The main idea revolves around the concept of the "content ID". You want a unique identifier associated with your header entity and with each of its child entities. Depending on the complexity of your scenario, you could simply store this ID in a property belonging to each of these entities, or you could go the extra mile and have a dedicated entity for this, which you would use to let your backend know when to start a "transaction" and when to commit its changes.
Back in the HAT days we did the latter, so our deep insert operations started by performing a create operation on our "Transaction" entity, followed by the creation of the desired parent entity and its child entities, and finished with an update operation on the newly created Transaction object (this update confirmed that the operation had ended and committed every change contained in that specific batch). For this we had a pivot table in our backend that we simply looped over to relate all of our entities to their corresponding transaction (linked by the Transaction ID contained in every single operation of the same batch).
As for MDK... I haven't had any reason to actually use a dedicated entity for this, and a simple TransactionID property has done the trick just fine. I have used this method to create headers containing N child and grandchild objects belonging to as many as 4 different entities without any issues whatsoever.
A simple JavaScript timestamp will serve you just fine as the transaction ID, since two of them generated on the same device will practically never collide.
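For illustration, a minimal sketch of generating such an ID in a rule; the function name is a placeholder, and the random suffix is a defensive addition of mine, since Date.now() alone can repeat within the same millisecond.

// Placeholder rule: build a transaction ID from the current timestamp.
// The random suffix is a defensive extra, not required by the approach above.
export default function GenerateTransactionID() {
    const timestamp = Date.now();                    // e.g. 1636985312345
    const suffix = Math.floor(Math.random() * 1000); // 0..999
    return `${timestamp}-${suffix}`;
}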
You can refer to this blog, which its author originally wrote as a workaround for this exact issue in HAT applications: https://blogs.sap.com/2016/02/15/introduction-odata-batch-processing-with-content-id-in-sap-gateway/
Francisco,
Thanks a lot for your advise and comments.
My experience with $batch calls is limited, since in prior SAPUI5 projects I used deep entities and never had the need until this MDK Offline scenario.
I have read through various online resources on $batch processing, including the one you kindly included in your response, and the backend part seems more or less clear: I will create the entities and associations and then process my backend logic in the redefined changeset methods mentioned above (CHANGESET_BEGIN, CHANGESET_PROCESS and CHANGESET_END).
But the part that is still not so clear to me is the frontend MDK side. How do you execute batch calls from there? Which actions do you use (CreateEntity, CreateRelatedEntity), or do you do it via code in rules? I would really appreciate it if you could share some of your code examples, with any confidential parts removed.
Many thanks,
Yergali
Hi yergaliyertuganov,
It's a pleasure. As for MDK, I haven't really needed to use batch processing for this particular scenario yet; the closest thing I can think of in MDK would be using ChangeSets, but that doesn't really answer the issue with deep inserts in an offline setting.
What I do for deep inserts is simply create every entity with a CreateEntity action instead of using CreateRelatedEntity. In a normal scenario, be it while working online or while creating a child entity whose parent already exists in your database, CreateRelatedEntity links your objects by using the @readLink property. For offline deep inserts I ignore the readLink property entirely, since it is my "TransactionID" property that is responsible for maintaining the reference between parents and children in the backend.
This is an example where I create N media objects attached to a Notification header previously created by the user in an offline setting:
let aCreateAttachments = attachmentObjects.map((attachment) => {
    let content = attachment.content;
    let fileName = attachment.fileName;
    // Override the CreateEntity action's properties for this attachment;
    // TransactionID points at the value stored by the header create action
    let oPromise = clientAPI.executeAction({
        "Name": "/pmp_smart_pm/Actions/MyNotifications/MediaEntity_Create.action",
        "Properties": {
            "Properties": {
                "BusObject": "BUS2038",
                "Content": content,
                "Filename": fileName,
                "Mimetype": "image/png",
                "TransactionID": "#ActionResults:CreatedNotification/data/TransactionID"
            }
        }
    });
    return oPromise;
});
In this case I store the "TransactionID" value in the ActionResult object generated after a successful Notification header creation (for more complex scenarios you could use ClientData for data persistence, but here the ActionResult is more than enough).
I then retrieve every file the user attached via an Attachment FormCell control and map this array into an array of promises, each promise containing a CreateEntity action for one of the dependent media objects I need to create.
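For context, reading the attachments could look roughly like the sketch below; "MyAttachment" is a placeholder control name, and I am assuming the control's value entries carry the content and fileName fields used above.

// Sketch only: resolve the Attachment FormCell control's value on the current page
const attachmentObjects = clientAPI.evaluateTargetPath('#Control:MyAttachment/#Value');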
You can then use Promise.all() to handle the execution of all your nested create actions in one single go:
return Promise.all(aCreateAttachments).then(result => {
    return clientAPI.getPageProxy().executeAction("/pmp_smart_pm/Actions/MyNotifications/NotificationEntitySuccessMessage.action");
}).catch(error => {
    return clientAPI.getPageProxy().executeAction({
        "Name": "/pmp_smart_pm/Actions/Messages/GetMessage.action",
        "Properties": {
            "Message": error
        }
    });
});
Then you can modify your backend logic to use this TransactionID value to link your entities as you desire. Once the offline synchronization ends and you have successfully retrieved your new objects with their corresponding backend generated IDs, you can bind them together using the usual MDK methods as you would with any other object that wasn't created deeply in an offline scenario.
I hope this helps and I'm sorry if I wasn't more clear in the first reply. Feel free to hit me up with any additional questions.
Also, FYI, this concept of executing MDK actions dynamically from JavaScript rules is called action overriding, and it is exceptionally useful in a huge number of scenarios.
Francisco,
Well explained, thank you very much.
Actually, I was also planning to use my own generated unique ID and to link the Header and child entity sets through an additional property in each, where I would store this unique ID. But my main challenge is that a Header and its child lines must be processed in the backend in the same call, at the same time (atomically: create all of them or none of them). That means the logic executed in the backend (I assume in the Header entity set's Create method) must check/process data from both the Header and its child lines. So I will either need to receive all the values together in the same sort of batch table, or have the child items saved beforehand in their backend table (there is no logic for them, they just need to be saved) and then call the Header backend logic, which would read the child lines' data using that unique ID.
If I am not mistaken, I understood from your code/logic that the Header entity set and its child entity sets (media attachments) are processed separately in the backend, basically recorded in two different tables. I mean, in your backend Header creation logic, do you have any need to check the children's values before the Header is given the green light to be created?
Thank you,
Yergali