Technology Blogs by SAP
Learn how to extend and personalize SAP applications. Follow the SAP technology blog for insights into SAP BTP, ABAP, SAP Analytics Cloud, SAP HANA, and more.
One of the most exciting releases in SAP Analytics Cloud planning has to be the beautiful Data Action APIs for the Analytics Designer.

Beautiful, because with just a few lines of code, an Analytics Designer app developer can automate a chain of data actions in the background, triggered by a single end-user action. This makes for a simpler and more direct end-user workflow and a cleaner application UI, and it also eases the creation of planning apps: less effort goes into guiding users through a maze of data action triggers, and more into the business logic. The elegance this API brings to the UI, the process, and the app structure is definitely going to push planning applications to the next frontier.

Now, what does it look like exactly?

In the Analytics Designer you will now find a new technical object, "Data Actions".

Once you have created a new object, you will be able to assign an existing Data Action to it via the builder panel, eliminating the need to place a Data Action trigger onto your canvas.

Once you have done that, the app becomes your playground, with Data Action component methods such as execute() and setParameterValue().
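As a minimal sketch, assuming a Data Action technical object named myDataAction with a parameter ID "myParameter" (both example names), the methods are used like this:

```
// Set a parameter value before execution (values are passed as an array)
myDataAction.setParameterValue("myParameter", ["2024"]);
// Execute the data action in the background; the response carries the status
var response = myDataAction.execute();
if (response.status === DataActionExecutionResponseStatus.Success) {
	Application.showMessage(ApplicationMessageType.Success, "Done.");
}
```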

Combined with the DataActionExecutionResponse component, you are able to do some awesome automation in the background, such as:

  • Triggering a Data Action implicitly

  • Using the status of a triggered Data Action to trigger the next in line

  • Setting the Data Action parameters automatically

Below is an example where you can trigger a Data Action (deriveDimension) implicitly, query the execution status and use it in turn to trigger the next Data Action in line (copyAllocatedValue), setting the parameter for it at the same time:
Application.showBusyIndicator("Preparing execution...");

var deriveDimResponse = deriveDimension.execute();
if (deriveDimResponse.status === DataActionExecutionResponseStatus.Success) {
	// Set the parameter for the next data action before triggering it
	// ("TargetVersion" and its value are example placeholders)
	copyAllocatedValue.setParameterValue("TargetVersion", ["public.Actual"]);
	var copyAllocResponse = copyAllocatedValue.execute();
	if (copyAllocResponse.status === DataActionExecutionResponseStatus.Success) {
		Application.showMessage(ApplicationMessageType.Success, "Execution successful.");
	} else {
		Application.showMessage(ApplicationMessageType.Error, "An error has occurred. Please try again.");
	}
} else {
	Application.showMessage(ApplicationMessageType.Error, "An error has occurred. Please try again.");
}
Application.hideBusyIndicator();
Below is another example where you can use the filter context of another widget and apply it to your data action, making it context sensitive:
var filters = table_1.getDataSource().getDimensionFilters("Account");
for (var i = 0; i < filters.length; i++) {
	if (filters[i].type === FilterValueType.Single) {
		var singleFilterValue = cast(Type.SingleFilterValue, filters[i]);
		dataAction.setParameterValue(parameterId, [singleFilterValue.value]);
	} else if (filters[i].type === FilterValueType.Multiple) {
		var multiFilterValue = cast(Type.MultipleFilterValue, filters[i]);
		dataAction.setParameterValue(parameterId, multiFilterValue.values);
	}
}
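Once the parameter has been filled from the filter context, the data action can be executed and its status checked, just as in the first example. A sketch:

```
var response = dataAction.execute();
if (response.status !== DataActionExecutionResponseStatus.Success) {
	Application.showMessage(ApplicationMessageType.Error, "Data action failed. Please check the filter selection.");
}
```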

You can see how, with just these few lines of code, it is possible to say goodbye to long queues of data action triggers in the UI and to users entering identical parameters repeatedly as they proceed through their workflow.

(No more long queues of data actions in the UI.)


This API can be used in any event of any shape or widget; the possibilities are limited only by your imagination. Typical use cases will of course include the onClick event of a button or the onResultChanged event of a table, but it is by no means limited to those. It is also worth mentioning one of the API's greatest advantages, among the many: it eliminates the ambiguity in data that arises when a user has forgotten a trigger or has run triggers in the wrong order. That can now be ruled out.
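Hanging the chain on a button, for instance, is just a matter of placing the script in that event. A sketch, reusing the Data Action objects from the first example (the button name is an example):

```
// Script body of the onClick event of Button_1
// The same chain runs entirely in the background:
var response = deriveDimension.execute();
if (response.status === DataActionExecutionResponseStatus.Success) {
	copyAllocatedValue.execute();
}
```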

To conclude: the new API makes it possible to push the complexity of the planning process below the surface and to trigger automation precisely where you need it. It extends your control over the data flow, and hence over data quality as well.

If you are interested in further related topics, do check out the links below: