Technology Blogs by Members

SAP planning developers can implement planning logic in SAP Analytics Cloud (SAC) by building data actions or multi-actions. There are, however, situations where these tools are too limited. With the introduction of API steps and data loads in multi-actions, we can go beyond SAC to make our calculations more powerful and establish more comprehensive planning flows. The blog series “Extending SAC planning” beautifully describes a practical example of how to set this up.

In this blog, I will investigate how to choose an approach for implementing planning logic in a SAC planning project across a few different use cases. Our first question should always be whether SAC's planning logic is adequate for our requirements and, if not, whether we should turn to backend HANA modeling capabilities. These would allow us to use more advanced calculations such as calculated columns, joins, and SQL window functions. And what about a hybrid way, where we combine data actions & multi-actions with HANA modeling capabilities?

I will investigate two use cases:

  • The standalone use case: SAC tenant with data actions & multi-actions planning logic.

  • The hybrid use case: SAC tenant with data actions & multi-actions combined with Datasphere remote SQL views.


I will analyze the use cases from three perspectives:

  1. Complexity of setup: the prerequisites needed and whether data replication is required.

  2. Power of calculations: the ability to create more comprehensive planning flows.

  3. Calculation optimizations: the ability to use optimizations to deliver faster results to end users.

1 The complexity of setup

                              | Standalone: SAC                     | Hybrid: SAC & Datasphere
Place of data logic execution | Data logic in the SAC HANA database | Data logic via remote tables in Datasphere
Prerequisites                 | SAC instance; planning license      | SAP Datasphere instance; Datasphere license; Data Provisioning (DP) Agent setup


In the standalone use case, the native HANA database executes all calculations made in SAC. This ensures seamless execution of calculations without data replication. No additional systems are required, and users only need a SAC planning license to execute multi-actions & data actions.

In the hybrid case, we encounter a more complex setup, as we need a DP Agent to access tables from SAC and make them visible in SAP Datasphere. That introduces potential calculation delays and a more complex system landscape, as connections must be established between SAC, the DP Agent, and Datasphere. Furthermore, a user needs an additional license to perform Datasphere calculations.
2 Power of calculations

Let's explore the various types of data logic that we can use in standalone and hybrid cases. The image also demonstrates how we can utilize REST API within SAC multi-actions, although this will be out of the scope of this analysis.

SAC-based planning logic

The image shows what types of calculations we can make in data actions & multi-actions. SAC provides standardized calculations such as currency conversion, allocation & predictive steps. It makes sense to use these standardized objects, as rebuilding them in Datasphere requires more effort. However, these standardized steps may be too limiting for more complex use cases.

For those use cases, we rely on the advanced formulas step in data actions, which offers more flexibility. This scripting environment allows us to use logic such as:

  • If/Then

  • For loops

  • Variables

  • Value lookups

  • Delete, append, and overwrite database records
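To make these constructs concrete, here is a minimal Python sketch of the kind of logic an advanced formulas step expresses: a value lookup, a loop, an if/then branch, and an overwrite or delete of records. This is illustrative only; it is not SAC's advanced formulas syntax, and the accounts and figures are made up.

```python
# Illustrative sketch of advanced-formula-style logic in plain Python
# (NOT SAC advanced formulas syntax; data is invented for the example).
# Fact table keyed by (account, month) -> value.
facts = {
    ("Units", "Jan"): 100, ("Price", "Jan"): 2.5,
    ("Units", "Feb"): 120, ("Price", "Feb"): 2.4,
}

def result_lookup(account, month):
    """Value lookup on the fact table, defaulting to 0 for missing members."""
    return facts.get((account, month), 0.0)

# For loop over months, if/then on a variable, overwrite/delete of records.
for month in ("Jan", "Feb"):
    units = result_lookup("Units", month)
    price = result_lookup("Price", month)
    if units > 0:                                   # If/Then
        facts[("Revenue", month)] = units * price   # overwrite record
    else:
        facts.pop(("Revenue", month), None)         # delete record

print(facts[("Revenue", "Jan")])  # 250.0
```

The same shape of logic, written in SAC's own scripting language, runs inside the data action on the model's fact data.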

The downside is that this scripting language is specific to SAC and not used in other IT tools, so less knowledge sharing and fewer resources are available online compared to other scripting languages. Considering these factors, the question arises whether similar tasks can be accomplished in Datasphere using SQL-based views. SQL has more advanced functionality and a larger knowledge base, and Datasphere provides a visual user interface. We answer this question after discussing the optimization flows in the next section.
Hybrid planning logic

In situations where advanced formulas steps cannot meet the requirements, we need more powerful calculations to accomplish our tasks. Here are a few use cases where hybrid planning logic comes into play:

  1. If we need to link information from multiple models, the "Link models" functionality of advanced formulas steps may prove to be restrictive. We cannot use different joins between models, which is possible with SQL graphical views.

  2. Advanced formula steps have limitations when calculating based on dimensional properties. It becomes challenging to group multiple members with the same properties and then use different properties from another dimension within the same calculation.

  3. When making calculations based on real-time data, advanced formulas steps fall short, as they rely on data stored in physical tables within SAC and do not support live models. With the hybrid use case, however, we can use remote tables to perform calculations on live models.
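The first limitation above can be illustrated with a small SQL sketch, here run through Python's built-in sqlite3 for self-containment. The table and column names are invented; the point is the join type: a SQL view can use a left outer join to keep planning records that have no match in the second model, which the "Link models" functionality cannot express.

```python
import sqlite3

# Illustrative only: two toy "models" joined with a LEFT OUTER JOIN,
# the kind of join a Datasphere SQL view allows. Names are made up.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE plan_model (cost_center TEXT, amount REAL);
CREATE TABLE hr_model   (cost_center TEXT, headcount INTEGER);
INSERT INTO plan_model VALUES ('CC1', 1000), ('CC2', 500);
INSERT INTO hr_model   VALUES ('CC1', 10),   ('CC3', 4);
""")
# LEFT JOIN keeps plan records even without a matching HR record (CC2).
rows = con.execute("""
    SELECT p.cost_center, p.amount, COALESCE(h.headcount, 0)
    FROM plan_model p
    LEFT JOIN hr_model h ON h.cost_center = p.cost_center
    ORDER BY p.cost_center
""").fetchall()
print(rows)  # [('CC1', 1000.0, 10), ('CC2', 500.0, 0)]
```

Inner, left, right, and full outer joins are all available this way, whereas linking models inside an advanced formulas step fixes the join behavior for us.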

By adopting a hybrid planning approach, we can address these limitations and achieve more robust and flexible calculations by leveraging the power of SQL views and integrating them with our SAC planning data.
3 Calculation optimizations

Next, let's explore the optimizations we can achieve with our calculations to improve execution speed and minimize system impact. I'll examine the optimizations within SAC itself and then consider how they apply to the hybrid use case.

SAC-based planning logic

We begin with a physical SAC model with data where the recommended planning area (RPA) is enabled for optimization.

  1. Based on the user's writing rights specified by Data Access Controls (DAC), SAC creates a writing table when the user edits any data.

  2. This writing table is the basis for data action calculations, allowing them to be executed on a specific selection rather than the entire model's data.

  3. Any changes made by the user are recorded in a delta table, which speeds up the publishing process by limiting the action to only the delta records.

In this way, several optimizations are implemented by calculating numbers within SAC.
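The delta-table idea behind these optimizations can be sketched in a few lines of Python. This is a conceptual model, not SAC's actual implementation: edits accumulate in a small delta structure, and publishing applies only those records rather than rewriting the whole model.

```python
# Conceptual sketch of delta-based publishing (not SAC internals).
# The base table holds the model's published data.
base = {("CC1", "Jan"): 100.0, ("CC2", "Jan"): 200.0, ("CC1", "Feb"): 150.0}

delta = {}  # user edits land here first (the delta/writing table)

def edit(key, value):
    """Record a user edit without touching the published data."""
    delta[key] = value

def publish():
    """Apply only the changed records; work scales with the edits,
    not with the size of the model."""
    applied = len(delta)
    base.update(delta)
    delta.clear()
    return applied

edit(("CC2", "Jan"), 250.0)
print(publish())  # 1 record applied, not 3
```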
Hybrid planning logic

In the hybrid case, we use the same RPA and DAC settings as in the standalone case.


  1. Before the changes are visible in the remote table within Datasphere, we first need to publish the user's input data.

  2. Then, the DP agent creates a remote table in Datasphere, where calculations are performed using a SQL view. This introduces a delay compared to the standalone option.

  3. Afterward, the calculated data must be imported into SAC via the data load mechanism. The whole remote table has to be reinitialized by SAC, with no possibility of delta loads based on updated data, leading to further delays in the import process.
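A back-of-the-envelope comparison makes the cost of step 3 tangible. The figures below are invented purely for illustration: with a full reinitialization, the rows transferred scale with the model, while a delta load would scale with the edits.

```python
# Illustrative numbers only: cost of a full remote-table reload
# versus a hypothetical delta load.
model_rows = 1_000_000   # rows in the planning model (assumed)
changed_rows = 250       # rows actually edited in this cycle (assumed)

full_reload_transfer = model_rows    # hybrid import re-reads everything
delta_load_transfer = changed_rows   # delta-style publish moves only edits

print(full_reload_transfer // delta_load_transfer)  # 4000x more rows moved
```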

In my opinion, this is a significant drawback of this approach, as it impacts the performance of the SAC system and increases waiting times for end-users to see the complete results.


In conclusion, for our standalone use case, SAC appears to be the preferred option for executing planning logic, as it benefits from the various SAC optimizations.

I previously asked whether the hybrid logic could serve as an alternative to advanced formulas steps, given SQL's greater power and broader familiarity. However, the hybrid solution cannot leverage optimizations such as delta loads, leading to increased system load and more processing power. As a result, it is likely not a favorable alternative to the optimized advanced formulas steps.

Nevertheless, in complex scenarios where SAC as a standalone tool falls short, the hybrid proposition can be intriguing as it enables the implementation of more sophisticated planning logic if necessary, albeit at the expense of optimization. It is important to recognize the trade-off and accept a less streamlined process when opting for the hybrid approach.

Product and Topic Expert

Hello mauritz89, thank you for a well-thought-out, timely, and relevant blog. From your perspective, what would be needed in SAC (as additional features) so that the standalone option would always be preferred? Best regards, Antoine Chabert (SAC Product Management)


Hi Antoine, 

Thank you for your comment and your question. I think there are two main reasons why we would go for the hybrid solution today: 

- Limitations in data action functions for creating complex calculations.

- Calculations in planning logic that require near-live data sources.

Introduce native SQL views in SAC 

My first suggestion is to introduce an expert option next to data actions to write SQL statements natively against the SAC database, a bit like the SQL views in Datasphere. The data action functionalities work well for planning logic of simple or medium complexity, but are quite limiting for complex planning logic, as only around 50 to 60 standard functions are offered. We then often go for a hybrid option so we can use more powerful tools through advanced SQL functions. Introducing this feature would let us rely on standalone SAC instead of a backend tool for these planning cases.

Make live models accessible for data action calculations

Second, make live models in SAC (perhaps via replication tables) accessible for data action logic calculations. We can currently only base our planning logic on physical models in SAC. That means that for data actions we always need to trigger a data import to calculate on near-live data. The hybrid approach allows us to calculate on live views via remote tables, which is not possible in SAC today. Introducing functionality to calculate with the views of live models would save us from building an ETL process for every single calculation requiring near-live data.


Hi Antoine,

I’d like to share my view as well, if you gentlemen don’t mind.

I am with Mauritz on the functionality and flexibility of advanced formulas for business logic calculations, especially in my country, Turkey. We deal with really complex process demands, especially from production companies, and this will never change, because this is the way our clients manage their planning cycles. We were fine and comfortable with BPC solutions because we could run ABAP, and there is almost nothing we can’t handle with ABAP. Within SAC there is no ABAP editor; we are fine leaving that behind, but not advanced formulas. What if we had a JavaScript editor instead of advanced formulas? That sounds perfect to me. We feel uncomfortable in our workshops and have been asking for the hybrid solution (BW/4HANA embedded BPC & SAC), but it is expensive for non-SAP customers. We have been looking for a way to handle our complex calculations for a long time, and believe me, we even use custom widgets as a calculation engine just because of the JavaScript. After the data export and import APIs, we are now looking at alternative hybrid solutions. Since you give us a good coding area, we have to keep looking for alternatives. So thank you for the investments in custom widgets, REST APIs, and multi-action functionality. It is still not enough yet, but we are more confident than before.

Because SAC has started to excite us. We are waiting for more good news.


SAP Datasphere and SAC are ripe for consolidation into a single, more robust product offering. Regrettably, their divergent development timelines and roadmaps have led to two products with considerable overlapping data acquisition functionality. For instance, the creation of models and data staging in certain scenarios are activities common to both.

A few of the compelling reasons that support this potential unification:

  • SAP Datasphere is versatile, offering comprehensive persistence and connectivity solutions for data. This means it can underpin any analytical application or story design, serving as the bedrock of our data infrastructure.
  • Existing planning applications could greatly benefit from the capability to leverage SQL, Python, and other such tools. This convergence will not only enhance their utility but also streamline their use in various business scenarios.

I believe the convergence of the two products is not just a strategic decision, but a natural and inevitable progression.

Currently, SAP Analytics Cloud supports the adoption of SAP Datasphere. In a similar vein, planning models can also be pushed to SAP Datasphere rather than SAC, allowing the power of Datasphere capabilities to be used to tackle complicated situations.

Still, data locking and workflow have limitations for implementing complex solutions.

Thanks & Regards


Active Contributor
Hey Mauritz, great blog & topic!

antoine.chabert is there a reason behind your question (what's needed for customers to always prefer the standalone approach)? License and cost aside, customers will always want the approach that is more capable of satisfying the requirements. So the answer is: almost everything DS can...

Datasphere has already become more flexible and powerful than SAC on many (technical & functional) aspects. So why would customers still want to start today with modelling & planning calculations in SAC standalone?
Product and Topic Expert
jefb I am interested to see people's respective views to the topic. It's great when a community... is a community 😉
Product and Topic Expert

For the record, the SAP Analytics Cloud influence request that started it all: Forecast/Plan sensitivity & risk calculations!



Active Contributor
antoine.chabert I was more aiming at how you phrased your question, suggesting standalone-approach enhancements. All the comments here make it clear to me that we would rather see a truly integrated solution and no longer separate products (with duplicate functionality & replication jobs), which today forces customers to choose.

On IRs, besides 292439 (which is a great use case) there are plenty of other IRs, all with different kinds of use cases that can benefit from a truly integrated platform. This blog might fit closer to 246614, but I suggest also looking into 262943, 271417, 295160, 291875, and 297669.