Technology Blog Posts by SAP
Max_Gander
Product and Topic Expert

Introduction

This blogpost aims to give you an overview of the functionality offered by the seamless planning integration between SAP Analytics Cloud and SAP Datasphere. I have already published a blogpost that gives an overview of seamless planning and announces the controlled release; it also covers the prerequisites for participating in and using seamless planning. I recommend reading that one first. You can find it here.

Please ask your questions in the comments, and I will continuously amend and update this blogpost. Once the initial release comes closer, I will also provide an example use case.

 

What is seamless planning?

Please, first read here.

 

How does SAP Business Data Cloud affect seamless planning?

Seamless planning is all about the integration of SAP Analytics Cloud and SAP Datasphere for planning purposes. Both SAP Analytics Cloud and SAP Datasphere are components of SAP Business Data Cloud, so seamless planning is highly relevant in the BDC context as well and provides the means to integrate plan data deeply into BDC.

Seamless planning does not require a BDC license or installation. However, BDC can add a lot of value to seamless planning in the future by enabling planning-enabled insight apps, extended planning architectures (SAP Databricks, BW PCE) or the consumption of data products in planning.

 

Connecting/linking SAP Analytics Cloud and SAP Datasphere

What must I change in my tenant to use seamless planning?

Once seamless planning is available to you, the system owner of SAP Analytics Cloud and SAP Datasphere needs to link the tenants. Linking will enable the selection of SAP Datasphere spaces as the data storage location for the supported object types.

 

What must customers know before linking SAP Analytics Cloud and SAP Datasphere?

Besides the prerequisites listed under General Questions, SAP Analytics Cloud and SAP Datasphere tenants can only be linked in a 1:1 relationship. One SAP Analytics Cloud tenant can be linked with only one Datasphere tenant. Customers must evaluate and carefully plan what this means for their specific landscape and lifecycle management.

In the future, we will aim to offer more flexibility regarding tenant relationships; however, there is no timeline for potential enrichment in this area.

 

Model deployment to SAP Datasphere

How do I deploy a model to SAP Datasphere?

Whenever you create a new model, you will be able to select the data storage location. This can be SAP Analytics Cloud (which means not using seamless planning) or SAP Datasphere (which means using seamless planning). If you choose SAP Datasphere, you need to choose a space.

You can also choose the data storage location when importing content via the Content Network in SAP Analytics Cloud.

 

Can I deploy Classic Account Models to SAP Datasphere?

No, Classic Account Models are not supported and will not be supported in the future.

 

Can I migrate existing models and content to the new architecture?

Migration support tooling is not available yet; however, it is considered a priority for future enhancements.

 

What is the approach for public dimension tables and conversion tables?

When creating public dimension tables and conversion tables, you will also be prompted to select the data storage location. A public dimension table or conversion table can only be used in a model if its data storage location matches the model's data storage location.

 

Data management

How will data management be handled in a seamless planning setup?

With the initial release, data management will be handled using SAP Analytics Cloud’s data management capabilities, such as the data acquisition framework, data import service, and data export service. This means that Datasphere will not be allowed to write data directly into SAP Analytics Cloud’s tables using SAP Datasphere’s data integration tools, such as transformation flows. This is a potential future enhancement.

The initial release does not support live integration of fact data available in SAP Datasphere into the seamless planning model. This enhancement is currently under development. Hence, right now data has to be loaded via OData services or, alternatively, via a HANA connection in combination with a free-hand query.
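For illustration, here is a minimal sketch of how fact data exposed through an OData service could be previewed from a script before setting up the import in SAP Analytics Cloud. The host, service path, entity set, field names, and credentials are placeholders for this example, not actual endpoints; in SAC itself, the import is configured through an OData services connection.

```python
# Minimal sketch: previewing fact data from an OData service before importing it
# into a seamless planning model. URL, entity set, fields, and credentials are
# placeholders for illustration only.
import requests

BASE_URL = "https://my-odata-host.example.com/odata/v4/planning"  # placeholder service root
ENTITY_SET = "ActualsByCostCenter"                                 # placeholder entity set

response = requests.get(
    f"{BASE_URL}/{ENTITY_SET}",
    params={"$select": "CostCenter,FiscalPeriod,Amount", "$top": 10},
    auth=("technical_user", "secret"),  # placeholder basic auth; your service may use OAuth
    timeout=30,
)
response.raise_for_status()

for row in response.json().get("value", []):
    print(row["CostCenter"], row["FiscalPeriod"], row["Amount"])
```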

 

How will master data be managed between SAP Analytics Cloud and SAP Datasphere?

SAP Analytics Cloud will continue using SAP Analytics Cloud managed dimensions to store and model master data required for planning. In the future, we want to allow SAP Analytics Cloud to consume dimensions previously created in SAP Datasphere.

The initial release does not support live integration of master data available in SAP Datasphere into the seamless planning model. Hence, right now data has to be loaded via OData services or, alternatively, via a HANA connection in combination with a free-hand query.
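For the HANA connection route, the free-hand query is simply a SQL statement you provide when importing data. The sketch below runs such a query through the hdbcli Python client purely to show its shape; the host, credentials, schema, and view names are placeholder assumptions.

```python
# Minimal sketch: the kind of free-hand SQL you might pair with a HANA connection
# to import dimension master data, executed here via hdbcli just to preview it.
# Host, port, credentials, schema, and view names are placeholders.
from hdbcli import dbapi

conn = dbapi.connect(
    address="my-hana-host.example.com",  # placeholder
    port=443,
    user="technical_user",
    password="secret",
    encrypt=True,
)

FREEHAND_SQL = """
    SELECT COST_CENTER_ID, DESCRIPTION, PARENT_ID
    FROM "MY_SPACE"."V_COST_CENTER_MASTER"  -- placeholder exposed view
    ORDER BY COST_CENTER_ID
"""

cursor = conn.cursor()
cursor.execute(FREEHAND_SQL)
for cost_center_id, description, parent_id in cursor.fetchmany(20):
    print(cost_center_id, description, parent_id)
cursor.close()
conn.close()
```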

 

Planning & Modeling Features

Are all SAP Analytics Cloud planning and modeling features available for seamless planning models?

Yes. All known planning and modeling features are, or will be, available for models deployed to SAP Datasphere. The few restrictions that apply to the controlled release are listed in this blogpost.

It is important to understand that you still build your model in SAP Analytics Cloud even if you choose to store its data in SAP Datasphere. All planning features are supported on such models.

 

Are SAP Analytics Cloud predictive scenarios supported?

Yes, you can run predictive forecasts on your planning models deployed to SAP Datasphere. Independently of seamless planning, we also plan to release regressions and classifications for SAP Analytics Cloud new models in the future. Again, this shall also be available for models deployed to SAP Datasphere.

Predictive scenarios are an area that will particularly benefit from seamless planning in the future, thanks to more seamless access to historic data and to data from the SAP Datasphere data marketplace.

 

Is SAP Analytics Cloud, add-in for Microsoft Office supported?

Yes, this is supported. 

 

Is Data Point Commenting supported?

Yes, you can use the known data point commenting on SAP Analytics Cloud models that are deployed to SAP Datasphere.
Note that commenting on live connections to SAP Datasphere analytic models for reporting is not supported yet and is a potential future enhancement (independent of seamless planning).

 

Consumption in SAP Datasphere

How are SAP Analytics Cloud artefacts exposed and consumed in SAP Datasphere?

For the model’s fact table and public dimension tables, you can choose to expose them in the data builder.

SAP Analytics Cloud models will expose the underlying data foundation as a “Local Table (Fact)”, while the public dimension tables will expose the master data as a “Local Table (Dimension)” associated with a translation table (storing the multi-language descriptions) and, in the future, hierarchy tables. Then, SAP Datasphere can use SAP Analytics Cloud-exposed objects in graphical views, SQL views, analytic models, transformation flows, etc.
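To make the fact/dimension association tangible, here is a sketch of the kind of join a graphical or SQL view in the Data Builder would perform on the exposed objects. It is executed here via the hana_ml Python client only to illustrate the query shape; the space, table, and column names are assumptions, not the actual names generated by your model.

```python
# Minimal sketch: joining an exposed "Local Table (Fact)" with an exposed
# "Local Table (Dimension)", mirroring what a Data Builder view would do.
# Space, table, and column names are illustrative assumptions.
from hana_ml.dataframe import ConnectionContext

conn = ConnectionContext(
    address="my-datasphere-host.example.com",  # placeholder database access host
    port=443,
    user="technical_user",
    password="secret",
)

plan_by_cost_center = conn.sql("""
    SELECT f."VERSION",
           f."FISCAL_PERIOD",
           d."DESCRIPTION" AS COST_CENTER,
           SUM(f."AMOUNT") AS AMOUNT
    FROM "PLANNING_SPACE"."SAC_PLAN_MODEL_FACT" AS f   -- exposed fact table (placeholder name)
    JOIN "PLANNING_SPACE"."SAC_COST_CENTER_DIM" AS d   -- exposed dimension table (placeholder name)
      ON f."COST_CENTER" = d."COST_CENTER"
    GROUP BY f."VERSION", f."FISCAL_PERIOD", d."DESCRIPTION"
""")

print(plan_by_cost_center.head(10).collect())  # collect() returns a pandas DataFrame
```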

 

Can SAP Analytics Cloud objects be changed directly in SAP Datasphere?

SAP Analytics Cloud objects will appear in read-only mode in SAP Datasphere, meaning that SAP Datasphere modelers cannot make structural changes to these objects.

 

Can planning data be shared outside the space in SAP Datasphere?

Once SAP Analytics Cloud’s models and public dimensions are exposed in SAP Datasphere, SAP Datasphere modelers can choose to share these objects with other spaces using SAP Datasphere’s sharing functionality.

 

Resource Utilization

Which product's resources are consumed by seamless planning models?

With seamless planning, planning models are deployed to SAP Datasphere spaces, so they are stored on SAP Datasphere's database. Planning and modeling activities like publishing, running data actions, importing data, etc., run on SAP Datasphere’s database and consume memory and CPU power there.

Hence, it is important to configure your SAP Datasphere tenants adequately for seamless planning. SAP Analytics Cloud's tenant size is not a performance-relevant factor for seamless planning models.

 

How should I configure/size my SAP Datasphere tenant for seamless planning? 

SAP Datasphere allows flexible tenant configuration. Check the SAP Datasphere Capacity Unit Estimator to learn about the available configurations which also include high-compute set-ups. More information is available in the SAP Datasphere help and in this blogpost.

Two things to note before we discuss sizing implications with regard to seamless planning:

  1. It is impossible to give specific sizing recommendations because many factors influence the required hardware (concurrent users, size of planning areas, number of measures and calculations, number of dimensions, complexity of data actions, number of accounts, cube density, and many more). Hence, no responsibility is taken with regard to the accuracy of the information in this blogpost for your use case.
  2. No matter which hardware is available, modeling best practices must be followed. Hardware alone does not provide good performance. 

Planning scenarios require adequate hardware. Attention should mostly be paid to the number of CPU cores and memory. 

You can grow your SAP Datasphere installation over time to find the right configuration and cater for growing adoption. However, note that minimal SAP Datasphere configurations may be too small for planning scenarios, even with few concurrent users. 
As a rough estimate, for planning installations whose short-term needs exceed 100 concurrent planning users, we recommend 512 GB of memory and 64 CPU cores as a starting point. Be reminded that this information is high-level only and no responsibility is taken with regard to its accuracy for your use case.

 

Can I optimize my SAP Datasphere space for planning?

Apart from configuring the SAP Datasphere tenant adequately, you should do the following:

1. If you use space quotas, assign accurate quotas in Space Management, especially for the available memory.

2. For each space used for seamless planning, we recommend the following settings in the Workload Management section under System - Configuration:

  • 100% total thread limit for space
  • 90% total memory limit 

(Screenshot: Workload Management settings for the space)

 

 

Can I monitor the effect of my planning models and activities on my SAP Datasphere tenant?

You can use SAP Datasphere's system monitors to track the loads generated by seamless planning models. More details can be found in this blogpost written by @fenja_schulz .
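If you prefer a scripted check in addition to the monitoring UI, the sketch below queries the HANA expensive-statements monitoring view that the system monitor builds on. Whether a database user can access this view, and whether expensive statement tracing is active, depends on your tenant's monitoring configuration; the host, credentials, and threshold are assumptions.

```python
# Minimal sketch: listing recent expensive statements from the HANA monitoring
# view surfaced by SAP Datasphere's system monitor. Access to the view depends
# on your monitoring setup; connection details and the threshold are placeholders.
from hdbcli import dbapi

conn = dbapi.connect(
    address="my-datasphere-host.example.com",  # placeholder
    port=443,
    user="monitoring_user",
    password="secret",
    encrypt=True,
)

cursor = conn.cursor()
cursor.execute("""
    SELECT TOP 20 START_TIME, DB_USER, DURATION_MICROSEC, MEMORY_SIZE
    FROM M_EXPENSIVE_STATEMENTS
    WHERE MEMORY_SIZE > 1000000000  -- statements peaking above ~1 GB (illustrative threshold)
    ORDER BY MEMORY_SIZE DESC
""")
for start_time, db_user, duration, memory_size in cursor.fetchall():
    print(start_time, db_user, round(duration / 1e6, 1), "s", round(memory_size / 1e9, 2), "GB")
cursor.close()
conn.close()
```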

 

Planning Activities

Can I change plan data directly in SAP Datasphere?

SAP Analytics Cloud application interfaces will remain in control of the model structure and all data changes. Direct write access to the planning fact table via SAP Datasphere will not be possible. However, we are considering a number of future enhancements for consuming data from SAP Datasphere in the planning model.

 

Could Datasphere extend SAP Analytics Cloud planning capabilities by accessing libraries of algorithms from Python, Azure, Pandas, etc.?

Programming languages like Python can extend Datasphere in many areas, including ETL, reporting, and analysis, and provide access to libraries like Pandas and NumPy. A good overview can be found here. Additionally, this blogpost shows how to implement an SAP Business AI project with a time-series forecast using the embedded Predictive Analysis Library.
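As a small illustration of that kind of extension, the sketch below pulls exposed actuals from SAP Datasphere into pandas via the hana_ml client and derives a naive moving-average forecast. Connection details, table names, and the version member ID are placeholder assumptions; in a real project you would typically let the embedded Predictive Analysis Library (reachable through hana_ml.algorithms.pal) do the forecasting inside the database instead of the client-side calculation shown here.

```python
# Minimal sketch: reading exposed actuals into pandas and computing a naive
# moving-average forecast. Connection details, table names, and the version ID
# are placeholders; PAL (hana_ml.algorithms.pal) offers in-database alternatives.
from hana_ml.dataframe import ConnectionContext

conn = ConnectionContext(
    address="my-datasphere-host.example.com",  # placeholder
    port=443,
    user="technical_user",
    password="secret",
)

actuals = conn.sql("""
    SELECT "FISCAL_PERIOD", SUM("AMOUNT") AS AMOUNT
    FROM "PLANNING_SPACE"."SAC_PLAN_MODEL_FACT"  -- exposed fact table (placeholder)
    WHERE "VERSION" = 'public.Actual'            -- placeholder version member ID
    GROUP BY "FISCAL_PERIOD"
    ORDER BY "FISCAL_PERIOD"
""").collect()  # collect() returns a pandas DataFrame

# Naive client-side forecast: average of the last three periods as the next value.
history = actuals.set_index("FISCAL_PERIOD")["AMOUNT"]
forecast_next_period = history.tail(3).mean()
print(history.tail(6))
print("Forecast for next period:", round(float(forecast_next_period), 2))
```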

 

Access, roles, security

What roles are required for the users?

Let’s look at this from different angles.

  • Modelers store the model in SAP Datasphere and build SAP Datasphere assets on top:
    • SAP Analytics Cloud Modeler and SAP Datasphere DW Modeler or equivalent role to create models and modeling assets in SAP Analytics Cloud, and save them in an SAP Datasphere space, with Read & Create permission on the SAP Datasphere data builder.
    • SAP Datasphere DW Modeler role or equivalent privileges to create views and analytical models in SAP Datasphere.
    • SAP Datasphere DW Consumer role or equivalent to view objects in spaces.
  • Planners need to have a user in SAP Datasphere. Besides that, they require the known roles and data access in SAP Analytics Cloud (Data Access Control or roles; no change in the context of seamless planning).
  • Reporting on top of analytic models is secured via SAP Datasphere, i.e. Read access and SAP Datasphere Data Access Control. Besides that, report consumers require the known roles in SAP Analytics Cloud to see stories. This has not changed with seamless planning.

How is data authorization for the planning model handled?

For the planning model, data access control and roles/model data privacy remain the tools to secure data.

A harmonization of the data authorization concepts across SAP Analytics Cloud and SAP Datasphere is a potential future enhancement.

 

Will there be common user management across SAP Analytics Cloud and SAP Datasphere?

Currently, no common user management is planned. To streamline the user management process based on the existing setup, customers could configure a custom central SAML identity provider to propagate user changes across SAP Analytics Cloud and SAP Datasphere. More info can be found here.

 

Restrictions

Are there any feature gaps when using seamless planning?

Note the following feature gaps in the initial release compared to the features offered for models that are not using seamless planning (updated April 2025). We will try to close them as soon as possible:

  • The following features of Data Export Service are unavailable (planned):
    • Delta subscriptions
    • Public dimension tables
    • Audit tables
  • The following features of the Data Import API are unavailable (planned):
    • Public dimension tables
    • Private versions
  • SAP Analytics Cloud compass (planned)

 

What are important functional restrictions in the current scope of seamless planning?

I’d like to point out these functional restrictions that we want to remove in releases after the controlled release:  

  • Having shared public dimensions across models is only possible when the models and the public dimensions are deployed to the same SAP Datasphere space. The same applies to the currency rate tables.
  • Hierarchies are not exposed to SAP Datasphere. If they are needed, SAP Datasphere modelers are required to rebuild them.
  • Cross-model planning activities, such as data action cross-model copy step, are only allowed if the required models are deployed to the same data persistence (i.e. all in SAP Analytics Cloud or all in SAP Datasphere), and in the case of objects deployed to SAP Datasphere, to the same SAP Datasphere space.
  • [Bug]: Fact and dimension tables that are exposed lose the exposure setting when transported/imported via Content Network.

 

Are there features that will not be supported on seamless planning models?

Indeed, there are features that will not be supported for Seamless Planning models. There are no plans to support them in the future as they are being phased out of SAP Analytics Cloud:

  • Classic Design Experience. You can only consume seamless planning models in the Optimized Design Experience.
  • Input Tasks. Use the SAP Analytics Cloud Calendar instead to coordinate your planning activities.

 

Future scope

What features are planned for the future?

First and foremost, we want to remove the functional restrictions outlined above. Beyond that, we plan high-value features to strengthen the cross-consumption of data and to streamline modeling efforts and the orchestration of seamless planning use cases. We have many ideas to deepen the integration even further; these are just some of them:

  • (Live) Cross-consumption of SAP Datasphere data
  • Migration support for existing SAP Analytics Cloud content to the new architecture
  • Re-use of dimensions from SAP Datasphere
  • Cross-orchestration of workflows between SAP Analytics Cloud and SAP Datasphere with multi actions and task chains

 

Licensing

What licenses are required to use seamless planning?

You need to own licenses for both SAP Analytics Cloud and SAP Datasphere. SAP Analytics Cloud is predominantly licensed via a user-based model, whereas SAP Datasphere is licensed by capacity. In the seamless planning scenario, you would license the planning functionality via SAP Analytics Cloud users and license the hardware (storage, memory, compute) required for planning via SAP Datasphere capacity units. See above under Resource Utilization.

 

Conclusion

Seamless planning is a big change and a big opportunity. Take the time to think about what it can mean for you and let us know your questions in the comments! As said, I will continuously amend and update this blogpost.

 

16 Comments
Sebastian_Gesiarz
Active Participant

This means that Datasphere will not be allowed to write data directly into SAP Analytics Cloud’s tables using SAP Datasphere’s data integration tools, such as transformation flows. This is a potential future enhancement.

Could you please shed some light on the possibility of loading actual data from SAP Datasphere into an SAC planning model hosted on SAP Datasphere? Will a new connection type be available?

 

Max_Gander
Product and Topic Expert

Hi @Sebastian_Gesiarz ,

In the first release, you need to load data into the SAP Analytics Cloud model using the known methods.

On the one hand, we are indeed planning a distinct connection type for SAP Datasphere. However, I think this will be a secondary option in the context of seamless planning. The reason is that we are already working on live access to the fact data available in the space that the SAP Analytics Cloud model is deployed in.

Sebastian_Gesiarz
Active Participant

@Max_Gander Thank you, does a known method mean that even though the data will be on the same DSP tenant, we will need to use OData from the DSP model to load the data to the SAC model? 

As for the future scenario, if the live data access from other DSP models sharing the same space will be available for the SAC, how will this data be added to the SAC planning model hosted in the same space?   

Max_Gander
Product and Topic Expert

@Sebastian_Gesiarz Yes, but as said, I hope this will change rather soon. 

In the future, you would be able to select a Datasphere fact table/view via data management in the SAP Analytics Cloud model and have the choice between live access and a copy/import of the data into the model.

fabianrunge1
Participant

Hi @Max_Gander,

could there be performance benefits when using Datasphere, as the model lives in the HANA DB of Datasphere and the resources can be scaled up (whereas in SAC this is not something we as developers can control)? I'm thinking about a very complicated table in SAC with many restricted key figures and a big G/L account that users are consuming. These can get slow under certain conditions at the moment. Could this be faster when the model is deployed in Datasphere? Or even slower when not enough resources are available?

Best

Fabian

Max_Gander
Product and Topic Expert

Hi @fabianrunge1 ,

In principle, with the same available resources, there should be performance parity between models stored in SAP Analytics Cloud and models stored in SAP Datasphere. But you are right: you can scale up SAP Datasphere in a more flexible way than SAP Analytics Cloud, where we have certain options for dedicated tenants (e.g. 512GB memory). We also want to add workload management capabilities on space level so that you could control the resources consumed by SAP Analytics Cloud workloads on the SAP Datasphere database.

BR
Max

 

MKreitlein
Active Contributor

Hello @Max_Gander 

Kudos that we are finally going to learn how seamless planning is coming to life and what it means in technical terms. 😄

I don't know, but somehow I imagined the final result differently... I thought we would also get truly "seamless" data storage and loading.

One example: Since very often the granularity between Actuals and Plan data is different, we need to aggregate the Actuals data to the lowest common denominator in another table. In the past we did this in BW and loaded that data into the SAC Planning model (version Actuals).

Now, when we prepare this data in Datasphere, I thought we could either load it directly into the SAC planning table in Datasphere, without the need to load it into SAC (planning model), which is then just another table next to the one where the data originates from 🙈

Or we could even use a Union Fact View for the Planning & aggregated Actuals tables to expose this (just like a BW CompositeProvider) to SAC for planning purposes. So the former planning model would be a shell, reading the Actuals version from one table and the Plan version from another. So, like in BW, you would define one table (DSO) as write-enabled.

This was how I imagined the upcoming solution. 😦

Question: If we combined the DSP SAC plan table in a Union Fact View with another table and created an analytical model on top, how would the version handling work? Are we able to define the same column in the 2nd DSP table for the union and work with it in a live story? Or will these planning tables not be consumable via Fact View and Analytic Model?

Thanks and BR,

Martin

 

 

Max_Gander
Product and Topic Expert

Hi @MKreitlein ,

 

You are addressing two main points if I understand correctly.

1. No possibility to load from DSP into planning tables (SAC model tables) directly. 

--> We plan to enable SAP Analytics Cloud models as targets for SAP Datasphere's transformation flows. No timeline yet

2. Virtual access to the data in SAP Datasphere (what you referred to as a composite provider like concept)

--> We are already working on virtual consumption of data residing in SAP Datasphere for the planning models. You would be able to integrate external versions like actuals (which could also be plan data from another seamless planning model) into the planning model from another fact table/view in the Datasphere space that the model is deployed to. You cannot change this external data, but you can reference it for planning. I.e., you can copy it to a plan version in your planning model or reference it in calculations, data actions...

 

With these mentioned features, I hope we get close to what you envision.

Finally on your question: You can consume the planning facts in SAP Datasphere views. We are currently designing the semantic re-use for the version dimension so that you can expose it as a master data table in the data builder and match it with your Datasphere data in which you will also have the version as a semantic type.

BR
Max

Savio_Dmello
Active Participant

Thanks for Sharing

maxboo
Explorer

Hi Max,

the big pain point still exists. The business wants reference data (actuals) while they input plan data.

Therefore we will still need to somehow make all the master data and all the actual transactional data available in the SAC planning models. This means that via import connections from e.g. SAP BW via Cloud Connector and Analytics Agent, we have a lot of replication flows / monitoring tasks, which is prone to errors.

The only thing that is easier now is reporting the plan data with the actual data by doing a union via an analytic model in Datasphere and consuming this in an SAC story. So we only got rid of the export of plan data to the backend system (e.g. BW), where you union the actuals and plan via a CompositeProvider and report via a query.

Hoped for a little bit more.

 


@Max_Gander wrote:

We plan to enable SAP Analytics Cloud models as targets for SAP Datasphere's transformation flows. No timeline yet

Please work on this ASAP as the actual import to SAP SAC is currently a real pain for our customers!

Also, by having the modeling still residing in SAC, we are bound to the limited modeling features in SAC and do not have the capabilities of Datasphere - from my perspective this is a wrong design decision (maybe due to technical restrictions).

 

Regards from another,

Max

 

MXK
Explorer

@Max_Gander wrote:

We plan to enable SAP Analytics Cloud models as targets for SAP Datasphere's transformation flows. No timeline yet 


I also hoped for a lot more. tbh ... 


@maxboo wrote:

The business wants reference data (actuals) while they input plan data.

Therefor we will still need to somehow make all the master data and all the actual transactional data available


I can only fully acknowledge and join @maxboo and @MKreitlein 's opinions. This does not make much sense after all. At the moment "Seamless Planning" is more like a marketing buzzword, not really coming close to something like seamless at all.  Even the feature integration with controlled release / GA is a pain when reading for existing customers. But ok. Let it be like that.

The need for an import connection - what benefit should that bring? Should it not be the other way around: to leverage Datasphere, as MKreitlein and maxboo said, to have the union on the scalable database side with more possibilities to transform the data or work with ML on it?

@Max_Gander wrote:

2. Virtual access to the data in SAP Datasphere (what you referred to as a composite provider like concept)

--> We are already working on a virtual consumption of data residing in SAP Datasphere for the planning models. You would be able to integrate external versions like actuals (could be plan data from another seamless planning model as well) from another fact table/view in the Datasphere space that the model is deployed to into the planning model. You cannot change this external data but you can reference it for planning. I.e., you can copy it to a plan version in your planning model or reference it in calculations, data actions...


Well, this is 'the' feature we all hoped for to have plan and actuals combined, without much hassle in between. Virtual consumption in a planning model? This sounds weird. If this is the "union" of reference/actual data as thought, why bother doing that? Isn't it easier as an implementer to leverage blending models (one planning, one actuals from a live connection) and blame performance on the user client?

I understand the good intentions, but does "referencing data for planning" always mean "copy it to a plan version"?

In the end I still don't get the product vision in terms of how to get away from copy actions and buying more and more storage units because of replication over and over. As already said (not by me), those replications make it cumbersome and expensive - is the latter the product vision?

Max_Gander
Product and Topic Expert

Hi @MXK @maxboo 

We are now taking a first step and it is a big one. We are all aligned in the sense that we see a lot more opportunities on top of that. 

We are fully committed to making further enhancements in 2025 that ease the integration of reference data from DSP tables (I'll share previews as soon as I can).
You shall be able to just display the data in the planning tables, copy it to plan, use it in data actions, etc. I'm not sure if that became clear.

We don't want import connections in the end vision of seamless planning. Replication for planning purposes shall be a choice for you guys - not an obligation.

BR
Max

Alok92
Discoverer

Thanks a lot!

albertosimeoni
Participant

Hello,

I found that the source of all this pain is this: DB users created inside a space cannot write into space schemas that are "reserved for Datasphere technical DB users".

This limitation "made probably to avoid third party tools" to use Datasphere as a database acts like a boomerang between two SAP products (no transformation flow write to SAC models, SAC write back into an openSQL schema).

This is all to guarantee audit control of user activity, which with database users is not captured in the integration monitor.

So we are sacrificing functionality in order not to break the "audit of user activities" and not to open Datasphere to third-party tools.

Do we really want this?

By the way, the workaround is: create a modeling view + create a replication flow with a connection to the DB user schema to update the model (if a full update is what you want).

But again another potential issue: only replication flows with full update can be scheduled.

Max_Gander
Product and Topic Expert

Hi @albertosimeoni 

I must admit I am not sure what unsupported scenarios you are referring to. Can you take a step back and explain with more context?

Thank you
Max

albertosimeoni
Participant

Hello @Max_Gander ,

a simple use case:

If I have a third-party ETL tool and I have created a local table from the Datasphere UI,

I cannot write to that table with an external ETL tool (SAP or non-SAP),

as the table is in a DB schema managed by the application (Datasphere) and related to a space (the schema has the same name as the space's technical name).

This is probably done to guarantee auditing of operations made in the Datasphere UI, and it acts like a boomerang in terms of the difficulty of integration between SAC and Datasphere; we can see this happening as SAC seamless planning writes to an OpenSQL schema instead of a Datasphere space schema.
The point is that Datasphere is an application built on top of a HANA Cloud database; the database is locked and managed by the application => not open to every use case.