Technology Blogs by SAP
Learn how to extend and personalize SAP applications. Follow the SAP technology blog for insights into SAP BTP, ABAP, SAP Analytics Cloud, SAP HANA, and more.
Matthew_Shaw
Product and Topic Expert

What should the landscape architecture of SAP Analytics Cloud look like and how does it compare with traditional on-premise landscapes? How should I manage the life-cycle of content in SAP Analytics Cloud? I address these and many other common life-cycle management questions in my article.

I first look at typical on-premise landscapes and then compare them to the Cloud: what’s the same and what’s different. The final architecture is determined by many factors, and I present the most important things you need to know, including summaries of:

    • Public and Private hosted editions
    • SAP Analytics Cloud Test Service and its Preview option
    • Quarterly Update Release and Fast Track Update Release cycles

I explain where each fits into the overall life-cycle, allowing you to determine the right architecture for your organisation. This includes explaining how and when you can transport objects between the different Services. Here, between the Preview Service and the regular Quarterly Release Cycle (QRC):

[Figure: Test Preview in overall landscape]

and here between the Quarterly Release Cycle and the 'Fast Track':

[Figure: Fast Track transport windows]

(the 'Fast Track' is not provisioned by default)I also present why multiple environments are needed at all and what can be achieved, albeit in a limited fashion, within a single environment when managing the life-cycle of content. I present the typical landscape options chosen by most other customers and the most common path of adoption. Thus, giving you some guidance on what your next step on your SAP Analytics Cloud journey might be:

 

Content Namespaces are also an important concept to understand and I present some recommendations on how to manage this to avoid known issues and reduce the risk for your projects. I also present the options for transporting objects around the landscape and I finish with a Best Practice Summary. The article is available below and also in other formats:

Latest Article: Version 1.2.4 - January 2024
Microsoft PowerPoint: Preview Slides - version 1.2.4
Microsoft PowerPoint: Download Slides - version 1.2.4
Video (mp4): Preview, 1h 16mins - version 1.1
Video (mp4): Download, 1h 16mins - version 1.1

Contents

[note: Sorry the links below no longer work due to a limitation with the new hosting platform]

 

Landscape Architecture: On-premise v Cloud

On-premise landscapes

‘Traditional’ on-premise landscape environments typically consist of:

    • Two primary tiers
        • One for the application/database. E.g. SAP BW
        • One for the Business Intelligence or Planning system. E.g. SAP BusinessObjects BI Suite
    • Both primary tiers hosted on-premise
    • Each environment (Sandbox, Dev, QA, Prod) has a unique role with regard to the content life-cycle

So, what changes when working with a Cloud Service?

What’s the same

    • Typically still need two primary tiers
        • Though some use cases can be fulfilled solely with SAP Analytics Cloud
    • Still need multiple environments (Sandbox, Dev, QA, Prod)
        • The role these environments provide remains valid, albeit now with a more obvious associated cost

What’s different

    • The SAP Analytics Cloud services are updated on a scheduled basis
    • There may be a desire for fewer Cloud environments to reduce Cloud Service fees
    • There could be cloud-to-on-premise dependencies
    • Impacts
        • life-cycle management activities
        • on-premise software update cycles

 

Landscape architecture

There are many factors that determine the landscape and these include

    • Choice of Public and Private hosted editions
    • Any use of SAP Analytics Cloud Test Service and its Preview option
    • Pros/Cons of Quarterly Release and Fast Track Update Release cycles
    • Options and limitations to promote content between the environments
    • Maturity of lifecycle management and change control practices

Public and Private Editions

    • Public Edition
        • Internal SAP HANA database instance is shared, with a schema per customer
        • Database performance could be impacted by other customer activity
    • Private Edition
        • Still has some shared components but the core SAP HANA database instance is dedicated
        • Dedicated database instance reduces the impact of other customers’ database activity
        • Typically chosen for larger enterprise deployments
    • Public Editions are the more typical choice in general
        • but not exclusively, since Private Editions offer greater protection of database performance from other customers’ services
    • Both ‘Public’ and ‘Private’ Editions are Public Cloud Services
        • There is no Private Cloud Service available, even for the ‘Private’ Edition
        • The ‘Private’ Edition is called private only because the SAP HANA database instance is dedicated

Bring Your Own Key

    • Customer Controlled Encryption Keys (CCEK) are only available for Private (not Public) Services
        • Includes Private Test Services (see later)
    • See KBA 3117993: How to use the "Bring Your Own Key (BYOK)" feature

Private Editions – a little more about sizing and performance

    • Analytic Models connecting to ‘live’ data sources require minimal hardware resources on SAP Analytics Cloud
        • Most of the load is on the data source
    • Analytic Models using acquired data use the underlying SAP HANA database, mostly for read access, so there is some load, but it is not too heavy
        • Likely to require more CPU than Memory
    • Planning Models using acquired data can require significant resources for both CPU and Memory
        • Planning models are highly dependent upon the model design
            • # of exception aggregations, # of dimensions, scope of the users’ security context when performing data entry
            • Physical size of a model isn’t necessarily the determining factor for performance!
    • Other factors: # of concurrent users, size and shape of the data, # of schedules etc.

Applicable for all environments

    • Public and Private editions are applicable to all environments
        • There are no restrictions on what type of edition can be used for any particular life-cycle purpose
        • However, performance load testing is only permitted against Private editions
        • Penetration testing is positively encouraged against all editions
            • For both load & penetration testing, follow SAP Note 2249479 to ensure compliance with contractual agreements, and so SAP doesn’t block you thinking it’s a denial-of-service attack!

Test Services

    • Provides a unique opportunity, designed with testing in mind
    • Includes named user Planning Professional licenses and a SAP Digital Boardroom license
        • Planning Professional includes Planning Standard license, and Planning Standard includes Business Intelligence named user license
        • However, there are no Analytics Hub users, nor any concurrent Business Intelligence licenses available with this service
    • Typically all Beta features, if requested and granted, can be enabled on Test Services
    • Cannot be used for Productive use
        • Does not adhere to standard SLAs
        • Means you cannot log a Priority 1 support incident with SAP
        • Does not mean it cannot connect to a Productive data source (i.e. you can connect a Test Service to a productive data source)
    • A Test Service is not required for testing as such; a regular non-Test Service can be used for all non-Production life-cycle use cases
    • Test Services are available for both Public and Private editions
        • However, Public Test editions have restrictions with the Test Preview option described later

Quarterly Release Cycle (QRC)

Primary Update Release Cycle

 

9 Quarterly Release Cycle (QRC).jpg

    • There is one primary update release cycle, the ‘Quarterly Release Cycle’ (QRC)
    • If you purchased SAP Analytics Cloud, then your service will be on this update release cycle by default
    • It means your SAP Analytics Cloud Service is updated:
        • on a scheduled basis, and you cannot elect for it to be ‘not updated’
        • once a quarter, so 4 times a year

About the wave version number

    • For the year 2023, the Q1 QRC wave version is 2023.02; for 2023 Q2 the wave version is 2023.08, etc.
        • The wave version number is a combination of the year, version and patch number
        • E.g. version 2023.02.21 is version 2 of the year 2023, with patch 21
        • Waves and patches are always cumulative (each includes all the features of everything before it), as the small sketch below illustrates
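
As a rough illustration of this numbering scheme (a sketch I’ve added, inferring the string format purely from the examples above, not from any official SAP parser), a wave version can be parsed and compared like any other dotted version:

```python
from dataclasses import dataclass

@dataclass(frozen=True, order=True)
class WaveVersion:
    """A wave version such as 2023.02.21: year, version (wave) and optional patch."""
    year: int
    version: int
    patch: int = 0

    @classmethod
    def parse(cls, text: str) -> "WaveVersion":
        return cls(*(int(part) for part in text.split(".")))

# Waves and patches are cumulative, so ordering the parsed tuples tells you
# which service is ahead:
assert WaveVersion.parse("2023.08") > WaveVersion.parse("2023.02.21")
```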

Life-cycle use of QRC in overall landscape

[Figure: QRC in overall landscape]

 

    • Life-cycle management needs the ability to transport objects between Dev, QA and Prod environments at all times
        • The ability to transport content is critical to running any productive system
        • There should be no ‘blackout’ period, where transporting objects is not possible
    • Thus, it makes perfect sense for Dev, QA and Prod to be on the same Release Cycle
        • Content can always be transported between all environments, but most typically between Dev and QA, and Dev and Production (as shown in the diagram)

Test Preview Quarterly Update Release Cycle

Test Preview

[Figure: Test Preview Quarterly Update Release Cycle]

 

    • Only available with a Private Edition Test Service
        • i.e. not available for a Public Edition Test Service
    • This ‘Test Preview’ service receives the Quarterly Update Release, but ~1 month earlier
        • It’s a regular Private Edition Test Service, but the update release cycle is the ‘Quarterly Preview’
    • Provides a unique opportunity to validate the new version with productive data and productive content
        • Also enables central IT to test new features and get a ‘heads-up’ for the upcoming version
    • It is expected that ‘Test Preview’ connects to a Productive data source, even though it is classed as a Test Service. This is necessary to validate the new version with existing Productive content, both data source and SAP Analytics Cloud content (models, stories etc.)
    • A ‘Test Preview’ service, like a regular Test Service, cannot be used for productive purposes; unlike a regular Test Service, it cannot be used for development purposes either

 

Life-cycle use of ‘Test Preview’ in overall landscape

[Figure: Test Preview in overall landscape]

    • ‘Test Preview’ introduces a new environment into the landscape, almost unique to the cloud
    • It is neither development, QA, pre-prod nor production
    • Preview should never be used for development purposes; its role is purely to validate new software with existing productive content
    • Since it is updated ~1 month ahead, for that month, you cannot transport content from it into another environment until those other environments are updated to the same version
        • i.e. you can only import into it (not export from it) during the month overlap
        • Typically you transport content from Production into Preview, but not exclusively
            • Dev content would also be transported into it at times
    • For 2 months of each quarter, it remains aligned with the Quarterly Release Cycle allowing you to easily transport content between all environments
        • Remember that, from a license contract point of view, Test Preview cannot be used for development or productive purposes

Fast Track Update Release Cycle

Fast Track

[Figure: Fast Track Release Cycle]

    • Updates are made ~ every 2 weeks, so about 26 times a year
    • Not provided by default and needs to be ordered specially
        • (SAP representative needs to request a ‘DevOps’ ticket to be raised prior to the order)
    • Up to 8 wave versions ahead of the Quarterly Release Cycle
        • As the version is considerably ahead, it’s tricky to transport content in and out of it

[Figure: Fast Track transport windows]

    • Transport of content from Fast Track to others requires you to ‘hold’ the content until the target is updated to the same version
        • However, the content must be the same or the previous version, so the ‘export’ needs to be performed in a small time window, otherwise it is ‘blocked’!  (1) (3)
    • Transport of content into Fast Track from others is incredibly limited, to a few times a year
        • ‘Older’ content can be transported into Fast Track, but only content of the Quarterly Release version is supported, otherwise it is ‘blocked’!  (2) (3)
    • (1) Technically, if content is transported via the ‘Content Network’ then all earlier versions (not just the current or previous) can be imported, but this is not supported.
    • (2) Technically, if content is transported via the ‘Content Network’ then non-QRC versions can be imported, but this is not supported.
    • (3) If content has been exported ‘manually’ (via Menu-Deployment-Export) then it is not even technically possible to import it. Additionally, this manual method is required if the source and target services are hosted in a mixture of SAP and non-SAP data centres. If the source and target are all non-SAP (or all SAP) data centres, then the ‘Content Network’ can be used to transport content, even across geographic regions. The manual method is legacy and its support could be withdrawn in the future. The version check these rules imply is sketched below.
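
To make these transport windows concrete, here is a minimal sketch (my own illustration, not an SAP tool or API) of the supported-version rule the footnotes describe, using simple integer wave numbers:

```python
def import_supported(unit_wave: int, target_wave: int, last_qrc_wave: int) -> bool:
    """Sketch of the rule above: an import is fully supported only when the unit
    was exported on the target's current wave, the wave immediately before it,
    or the most recent Quarterly Release wave."""
    if unit_wave > target_wave:
        # Units created by newer services are not offered for import at all
        return False
    return unit_wave in (target_wave, target_wave - 1, last_qrc_wave)

# A Fast Track service up to 8 waves ahead must therefore 'hold' its export
# until the QRC target catches up to within one wave of the exported version.
assert import_supported(unit_wave=12, target_wave=12, last_qrc_wave=8)
assert not import_supported(unit_wave=12, target_wave=8, last_qrc_wave=8)
```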

Life-cycle use of ‘Fast Track’ in overall landscape

 

    • Perfect for validating and testing new features that will come to the Quarterly Release later
    • Occasionally Beta features can be made available in non-test Services, allowing organisations to provide early feedback and allow for SAP to resolve issues ahead of general availability
    • Suitable for Sandbox testing only
    • Explicitly not suitable for productive or development of any content
    • Do not rely upon the ability to transport content into or out of Fast Track
    • For the Fast Track Update Release cycle:
        • Occasionally the update schedule changes in an ad-hoc fashion to cater for platform updates or other unplanned events
        • Rarely, but not completely unheard of, an entire update is skipped, so the next update jumps two versions. It’s possible the window of opportunity for transporting content is closed for some quarters

Update Release Cycles

[Figure: Update Release Cycles]

For SAP Analytics Cloud Services within the same update release cycle:

 

 

    • All services, public and private editions, are updated at the same time when they are hosted in the same data centre
    • Data centres are hosted all around the globe. Each has a schedule that will vary slightly from others
        • Quarterly Release Cycle schedule dates are published in SAP Note 2888562
        • Exact dates vary slightly by region and data centre host
            • SAP, Amazon Web Services, Alibaba Cloud (China) and Microsoft Azure
        • Some fluidity in the schedule is necessary for operational reasons so the update schedule for Fast Track is not published
        • Data centres are complex and updates are occasionally delayed to ensure service levels and maintenance windows are not breached. Delays can be caused by urgent service updates to the underlying infrastructure
    • It's important to ensure all your SAP Analytics Cloud Services are hosted in the same Data Centre to ensure version consistency with regard to life-cycle management

Why multiple environments

Objects relate to other objects by ID

    • Objects inside SAP Analytics Cloud relate to each other by identifiers (IDs)
    • It is therefore not possible to manage the life-cycle of different objects independently of each other within the same service
    • For example, taking a copy of a story means that the copy will:
        • still point to the original model (red line in diagram)
        • not point to a copy of any model taken
        • lose all comments and bookmarks associated with the original
            • Other dependencies include SAP Digital Boardroom and discussions, but there are many more
        • have a new ‘ID’ itself so end-user browser-based bookmarks will still point to the original story
    • To re-point a copied story to a copied model requires considerable manual effort and is not appropriate for life-cycle management change control
    • Multiple environments are thus mandatory for proper change and version control, as the sketch below illustrates
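
The following sketch (object shapes invented for illustration; SAC’s internal storage is not public) shows why a copy breaks the dependency chain: the copy receives a fresh ID, yet its model reference still points at the original model:

```python
import uuid

def new_object(kind: str, name: str, model_id: str | None = None) -> dict:
    """Illustrative object record: every object carries a generated, immutable ID."""
    return {"id": str(uuid.uuid4()), "kind": kind, "name": name, "model_id": model_id}

model = new_object("model", "Finance")
story = new_object("story", "Finance Story", model_id=model["id"])

model_copy = new_object("model", "Finance")          # a copy is a brand-new object
story_copy = {**story, "id": str(uuid.uuid4())}      # 'save as': new ID, same references

assert story_copy["model_id"] == model["id"]         # still points at the original model
assert story_copy["model_id"] != model_copy["id"]    # never at the copied model
assert story_copy["id"] != story["id"]               # old browser bookmarks miss the copy
```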

What life-cycle management can be achieved within a single Service

Some life-cycle aspects can be achieved within a single Service

    • Although objects dependent upon models (like Stories or Applications) cannot be independently managed within one ‘system’, some life-cycle aspects can be managed within one SAP Analytics Cloud Service
    • This typically applies to ‘live’ data sources
        • such as Data Warehouse Cloud, HANA, BW, BPC, S/4, BI Platform etc.
        • this tends not to be applicable to ‘acquired’ data models, though it is still possible
    • For example, when a new version of the data source (or copied SAC model) has been created, that new version can be validated in the same SAP Analytics Cloud Service
        • For ‘live’ data sources, the step of creating a new Model is simple, quick and easy
        • Creating a new Story with new, often simple, widgets can help validate ‘changed’ aspects of the data source (or copied SAC model)
        • It thus helps validate changes made to the next data source version
        • ‘live’ data sources are typically hosted in other cloud services or on-premise systems
    • It would not validate the existing SAP Analytics Cloud content within that same Service, but it does provide a level of validation and an opportunity to re-develop or adjust the model/data source design
    • Thus, albeit in a limited way, for the data source at least, two environments within a single SAP Analytics Cloud service can be supported

 

Simplifying the landscape

 

    • Some organisations have 4 systems in their on-premise landscape; however, mirroring this setup is typically undesired in the Cloud, as the costs are more obvious
    • The option to validate data source changes within one SAP Analytics Cloud service is available
    • This means 4 data source systems can be managed with 3 SAP Analytics Cloud Services
        • Sandbox could be removed for this purpose - you may still need it for another!
            • See Fast Track Services

    • Development validates new data source (and SAC model) changes by creating new story widgets that are soon destroyed
    • Once data source changes have been validated, the changes can then be made to the data source that supports all the existing content dependent upon it
    • This enables the development and validation of SAP Analytics Cloud content independently of the ‘next’ data source version that may be in development

 

Typical landscape options chosen

 

    • Arrows indicate most common path of adoption
    • First choice is typically Option 1
        • Useful for an initial evaluation
    • Next is to add a Dev environment, Option 2
        • Ideal for developing new content and starting to use the transport services available
    • ‘Test Preview’ is often added next, Option 4
        • Most customers find a need to validate new upcoming features with productive content
    • Sandbox options are typical for customers wishing to try out features way ahead of general availability. It reduces the risk to the project by having validated features ahead of time
    • Update Release Cycles are as shown
        • Sandbox and Preview are not on QRC
        • Dev, QA and Prod are on QRC
    • Non-productive environments only need a handful of users, perhaps 10 users or fewer
        • You will need at least as many users as developers/testers, but it doesn’t necessarily need to be a significant investment
    • Due to license terms on Test Services, Preview has a minimum number of users: 20 for Public Editions and 50 for Private Editions
    • A minimum of 25 users is required for the Scheduling Publications feature (see blog for more details and an exception for Partners)
    • Typically, only the ‘Test Preview’ Service uses a ‘Test Service’; all others are non-Test Services, including Sandbox, Dev and QA
    • Private editions are recommended for large enterprise deployments, commonly for production

 

Mixing Public and Private Editions

    • Public and Private Services can be mixed
    • There is no restriction except:
        • a Test Preview Service must be a Private Edition
    • The options shown here are just examples, there are many more supported combinations and permutations

Mixing regular Productive and Test Services

 

    • A regular (non-Test) ‘Productive’ SAP Analytics Cloud Service can be used for all environments with one exception:
        • ‘Test Preview’ must be a ‘Private Test’ Service (option C)
    • A Test Service can be used for all environments except ‘Prod’; otherwise, any mixture is possible
    • A Test Service must never contain personal data for contractual reasons
        • If you need to ‘Dev’ or ‘QA’ with personal data, you’ll need a Production Service (option C)
    • The options shown here are just examples, there are many more supported combinations and permutations

Typical Landscape connectivity (for live sources)

 

    • It's very important to validate content against production size and quality data
        • Production data sources are accessed by Prod, but also Preview and occasionally Sandbox
        • Preview requires access to Prod data source to validate content with the upcoming wave version
            • Even though Preview is a ‘Test’ service, it can still connect to Prod data
        • Use a copy of Production data for QA purposes where possible
    • The user who performs the change to the connection needs access to both the current and the new data source at the same time (hence the dotted line from Dev to QA and Prod)
        • Switch the connection before transporting new model versions
            • Since there is no concept of ‘connection mappings’ or ‘dynamic connection switching’
        • Switch the connection back after creating the transport unit
        • You only need to do this once, and whilst it is presented here as changing the connection in the source, it’s often done in the target
        • Please see the next section on ‘respecting the ID’ and, in particular, the best practices for managing live connections

Respect the ID and Best Practices for updating objects

Different objects, same name

Updating objects is performed by ID, not by name

    • When objects are transported from one environment to another, upon updating the target, objects are matched on the ID
        • If the IDs match, the object can be updated
        • If there is no match by ID, the object will be new
    • This can lead to multiple objects with the same name in the target, as shown in this workflow and sketched below
    • It is impossible to create two new objects with the same ID, even across different environments
        • An object, when transported from one environment to another, maintains its ID (and the Content Namespace of where it was first created)
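
A minimal sketch of the matching rule just described (an illustration of the documented behaviour, not SAC code): the target looks the incoming object up by ID alone, so names never influence the match, which is exactly how duplicate names arise:

```python
def apply_import(target: dict[str, dict], incoming: dict) -> str:
    """Illustrative import into a target that maps object ID -> object."""
    outcome = "updated" if incoming["id"] in target else "created"
    target[incoming["id"]] = incoming  # match by ID: update in place, else add new
    return outcome

prod = {"abc": {"id": "abc", "name": "Finance"}}

# A same-named object with a different ID becomes a second 'Finance' object:
assert apply_import(prod, {"id": "xyz", "name": "Finance"}) == "created"

# A transported object that kept its ID correctly updates the original:
assert apply_import(prod, {"id": "abc", "name": "Finance"}) == "updated"
```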

Best Practices for updating objects

 

    • In order to update an object created in Production with a new version in Dev, it must first be transported to Dev (Step 2)
    • The object can then be updated to the new version (Step 3) before being transported back to Prod (Step 4)
        • Typically it should go to QA first for testing!
    • Since the ‘ID’ has been respected, the object is matched by ID and is correctly updated
        • There is no duplicated object name in the target
        • All dependencies are also respected

Best Practices

    • Only create objects once and then transport them to other environments
    • Avoid ‘copying’ objects or ‘save as’, as this creates a new object with a new ID
    • Create objects with a name that doesn’t refer to its current environment
        • i.e. avoid “Test Finance”, instead use “Finance”
        • Applicable for all object types: folders, stories, teams, roles, models etc

 

Special notes for live connections

 

There are no exceptions, and this includes live connection objects

    • Create the connection object once across the entire landscape, just like any other object (step 1)
    • Transport the live connection from the source environment to all other environments, just like all other objects
        • This ensures the connection ID is the same and consistent across the entire landscape (step 2)
    • Update the connection details, within each environment so the connection points to the relevant data source for that environment (step 3)
    • Never promote the connection again (since doing so would overwrite the change you just made)
    • Then everything is easy:
        • As you promote models around the landscape, the models and stories will connect to the data source for the environment they are stored in, as the sketch below shows
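
A sketch of what this workflow achieves (environment names, hostnames and field names are invented for illustration): one connection ID everywhere, with different connection details per environment, so a transported model resolves to the right data source without any remapping:

```python
# Steps 1-2: the connection object is created once and transported, so its ID
# is identical in every environment. Step 3: only the details are edited locally.
connections = {
    "Dev":  {"id": "FINANCE_HANA", "host": "hana-dev.example.local"},
    "QA":   {"id": "FINANCE_HANA", "host": "hana-qa.example.local"},
    "Prod": {"id": "FINANCE_HANA", "host": "hana-prod.example.local"},
}

model = {"id": "m1", "connection_id": "FINANCE_HANA"}  # models reference connections by ID

def resolve_host(environment: str, model: dict) -> str:
    """Wherever the model is transported, the same connection ID resolves locally."""
    connection = connections[environment]
    assert connection["id"] == model["connection_id"]
    return connection["host"]

assert resolve_host("Prod", model) == "hana-prod.example.local"
```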

Special notes for live connections: Custom Groups

    • Custom Groups created in stories are stored inside the stories; however, the Custom Group definitions are associated with the connection ID, not the story!
    • This means that if Custom Groups are developed in Dev and transported to Prod, those Custom Groups will only work in Prod if the connection ID is the same as in Dev
    • Whilst you didn’t have to follow this best practice for managing connections before, since it was easy enough to just change the connection a model uses, now that we have Custom Groups it becomes necessary

Content Namespaces and Best Practices

Changing the default

 

    • Each SAP Analytics Cloud Service has a Content Namespace
    • Its default format is simple: [Character].[Character]
        • Default examples include: t.2, t.0, t.O
    • You can change the Content Namespace to almost anything you like
        • e.g. ‘MyDevelopment’
        • but it’s best to keep it to 3 characters to reduce the payload size of metadata objects

    • When any content is created, it keeps the Content Namespace that Service had at the time of its creation
        • Like an object’s ID, you cannot change the Content Namespace of an object
    • This also means that when content is transported, it keeps that Content Namespace with it
        • As shown above, the model and story, when transported into QA, maintain the Content Namespace of Dev
    • It's likely any one Service will contain content with different Content Namespaces
        • For example, importing the Samples or Business Content from SAP or Partners, will have different Content Namespaces

Pros and cons of consistent values

    • Benefit of different Content Namespaces across environments:
        • You can identify in which environment the object was originally created
            • Though the namespace is hidden from end-users, you can see the namespace in logs and browser consoles
    • Benefits of the same and consistent Content Namespace across all environments:
        • Easier coding of Service APIs (REST, URL, SCIM)
            • These APIs often refer to or include the Content Namespace, so a consistent value means slightly easier coding
        • Reduces project risk, typically when using APIs
            • There are occasional issues when multiple Content Namespaces are used; whilst SAP will fix these, it is better to avoid any surprises by using a consistent value (a hypothetical illustration follows this list)
    • Known issues when changing Content Namespaces
        • Teams with Custom Identity Provider
            • Once changed, new objects will have the new Content Namespace, and this includes ‘Teams’
            • This also means the SCIM API used by Custom Identity Providers, if utilising the ‘Team’ attribute mapping, could fail when Teams use a mixture of Content Namespace values (KBA 2901506). A workaround is possible, but it is better to avoid the situation until a full solution is available
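
As a purely hypothetical illustration of the coding point above (the identifier format below is invented; the real shape of namespaced references is not documented here), a consistent namespace means API code needs only one constant rather than a per-environment lookup:

```python
NAMESPACE = "t.0"  # deliberately identical across Dev, QA and Prod

def team_ref(team_name: str, namespace: str = NAMESPACE) -> str:
    """Hypothetical namespaced reference, of the kind a SCIM 'Team' attribute
    mapping might need to construct. One consistent namespace, one constant."""
    return f"{namespace}:{team_name}"

# With mixed namespaces, the same team could carry a different prefix in each
# environment: the class of surprise KBA 2901506 warns about.
assert team_ref("FINANCE_ADMINS") == "t.0:FINANCE_ADMINS"
```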

Best Practices

 

    • Do set the Content Namespace to be the same and consistent value across all environments
    • Change the Content Namespace as one of the first things you do when setting up your new Service
        • Do this before setting up any custom Identity Provider
        • Do this before creating any Teams
    • Set the value with a small number of characters and keep the format the same (e.g. T.0)
        • Lengthy values (e.g. MyWonderfulDevelopmentSystem) cause extra payload in communications and if the number of objects referred to in a call is large, this could have a performance impact
        • No need for a lengthy name, just keep it short and simple!
    • Do NOT change the Content Namespace if you already have a working productive landscape
        • Especially if you are using a Custom Identity Provider and mapping user attributes to Teams
        • Don’t fix something that ain’t broke

Transporting Objects, Best Practice and Recommendations

[Figure: Option 1 and Option 2]

 

 

 

There are 2 options for transporting content:

Option 1: Content Network (recommended)

  • ‘Unit’ files are hosted in the Cloud
    • Can be organised into folders, including folder security options
    • Processing occurs in the background
    • No need for any manual download/upload
      • though this is possible and still needed if transporting content across different data centres
  • Unlike (legacy) manual deployment option:
    • Supports a greater number of object types

 

  • Will only show Units that can be imported into the Service
    • Units created by newer versions of the Service will not be shown (i.e. Units created on Fast Track Release Cycle, but the current Service has yet to be updated to that version)
    • Older units that are not supported can still be imported, but a warning message is shown
      • Fully supported units are when the version is the same, the previous or the previous Quarter Release

Option 2: (legacy) ‘Deployment-Export/Import’

    • Requires manual management of ‘Unit’ files: downloading from source and uploading to the target
    • Has limitations on file size
    • This option will be deprecated in the near future

 

    • For a little more detail on the best practices and a short demo of option 1, please visit this article and blog post for Q&A

 

Best Practice Summary

Key takeaways

    • Use multiple Services to manage the life-cycle of content
    • Do keep Dev, QA and Prod on the same update release cycle
    • Always validate content against production quantity and quality of data
    • Use the Test Preview to validate new wave versions with existing Production content
    • Use Fast Track to reduce risk to the project especially when success is dependent upon upcoming features
    • Respect the ID of objects and create content once, then transport it to other environments
    • Change the Content Namespace only at initial setup and set all Services to the same value
    • Use the Content Network to transport objects between Services

 

Frequently Asked Questions

    • Question: What is the best practice to manage the life-cycle of content with just one SAP Analytics Cloud Service?

 

 

    • Answer: SAP Analytics Cloud has been designed so that content life-cycle management requires multiple SAP Analytics Cloud services. It means a single SAP Analytics Cloud Service cannot manage the life-cycle of content on its own
        • The service provides various tools by which content can be transported from one SAP Analytics Cloud Service to another and these tools are constantly being developed and improved
        • It is explicitly recommended not to attempt to manage the content life-cycle (of SAP Analytics Cloud content) within a single SAP Analytics Cloud Service

 

    • Question: Can I use Test Preview for ‘Dev’ and a regular productive service for ‘Prod’?

[Figure: Test Preview as Dev]

 

    • Answer: No
        • The Test Preview agreement states you cannot use a Test Preview for development or productive purposes
        • Additionally, the schedule of the wave updates is not suited to supporting the life-cycle of content during the time the two release cycles are not aligned. It’s critical, for good life-cycle management, to have the ability to transport content between Dev and Prod the whole time and not to have any black-out periods that could cause operational issues

 

    • Question: Is there a private cloud service available for SAP Analytics Cloud?

    • Answer: No
        • Both the ‘Public’ and ‘Private’ Editions of SAP Analytics Cloud are public cloud services
        • The term ‘Private’ is used to distinguish when the SAP HANA database is dedicated or not. A ‘Private’ edition is actually a public cloud service, just with a dedicated SAP HANA database, also hosted on a public cloud service

Your feedback is most welcome. Before posting a question, please read the article carefully and take note of any replies I’ve made to others. Please hit the ‘like’ button on this article or comments to indicate its usefulness. Many thanks

Matthew Shaw @MattShaw_on_BI

 

30 Comments
Henry_Banks
Product and Topic Expert
Fantastic blog matthew.shaw

Thanks for taking the time to put this together.

I know this is a really complex topic, with different, wide-ranging experiences for our various customers.

This article really helps pull that all together, and will be perfect for a forthcoming discussion with one of my customers!

Kind regards

Henry
inigo_montoya
Participant
Thanks for the blog, Matthew!

I have also reviewed your interesting powerpoint document.

One comment I have: it would be useful to include the impact of the QRC on the on-premise source systems.

I mean, for instance, when a live connection is used, I am sure you are aware that there are a number of notes that must be applied.

Then, I guess Test Preview should be pointed to a sandbox environment with the notes applied.

What is your recommended approach?

 

Thanks again for the insights,

Inigo Montoya
bosegsap
Participant
Hi Matthew

Attended your webinar today; it was awesome. It’s crystal clear.

By the way, I was wondering whether SAC has a dedicated architecture, beyond the SAC overview (attachment), similar to what BW has (such as LSA, LSA++).

 

Regards

Bose
Matthew_Shaw
Product and Topic Expert
Hello Inigo

Many thanks.

For the impact on on-premise systems, please take a look at another blog of mine: https://blogs.sap.com/2020/06/19/sap-analytics-cloud-technical-and-administration-overview/

Any notes, typically needed only for new features, would need to be applied to the sandbox database (live data source using a live connection), yes. My other blog/video talks about this with a bunch of SAP Notes to follow. Generally speaking, your existing content should not stop working just because you're using a newer version of SAC. But having the sandbox gives you that reassurance, and if any action is needed, then using a Fast Track Sandbox or a Quarterly Release Preview gives you the opportunity to take action before the Quarterly Production/Development/QA environments are updated.

Typically speaking, though, any on-premise updates are for defect or performance reasons, though not exclusively. For example, the BI Platform sometimes requires a Support Pack update or an update to Tomcat/the JDK, etc. So there are exceptions, but if you're on HANA or BW then you only really need to be more concerned if you're on a very old version of BW or HANA where SAP might increase the minimum support level, and there are SAP Notes to follow to give you plenty of time to plan accordingly.

Regards, Matthew
Matthew_Shaw
Product and Topic Expert
Many thanks Bose for your feedback.

There are no architecture diagrams as such, but I'll bear this feedback in mind as I can see there's a need for it. Could you describe more about what you'd like to see, and why? I can then use this feedback to see what can be done. Many thanks indeed, Matthew
anirudhsv
Explorer
Hi Matthew,

Your blogs are a one-stop solution for SAC questions! I find the life-cycle management of SAC content in a live BW scenario quite unique and different from a BO-server landscape.

Is there information available on how objects like Comments and Bookmarks are stored? Are they stored in separate files or in the same file as the Story? When the same story, with no comments, from the Dev tenant is transported using the Content Network to the Prod tenant, where the story does have comments/bookmarks - are the comments/bookmarks retained in the Prod tenant?

If yes, Does that imply the comments/Bookmarks for a story are stored in separate files?

If No, do we need to transport the story to Dev. tenant before any change is made to story so that objects like Comments/Bookmarks are retained?

Is there a plan to introduce automatic system switching for Live connection in the near future?

Looking forward to your feedback!

Best Regards,

Anirudh
bosegsap
Participant
Matthew

 

The reason being: as part of our discussions with BW/HANA/B4/S4 teams regarding SAC pros & cons in conjunction with BusinessObjects, the SAC architecture issue came up, hence this query

 

Best Regards-Bose

 
Matthew,

Thanks for the content, very helpful and comprehensive tour covering many aspects around SAC.

A question around your point 'a single SAP Analytics Cloud Service can not manage the life-cycle of content on its own', in our single-tenant landscape we are facing some of the limitations you describe in change and application life cycle management for our SAC stories, in your opinion which is the most cost-effective approach to implement a SAC Test environment and transport changes via Content Network?

We currently operate our SAP S/4HANA infrastructure in private Cloud and SAC Production tenant in cloud foundry, is best option to request a test tenant in the cloud or there are options to go on-premise or private cloud?

Regards,

David
former_member1161
Active Participant
matthew.shaw Thanks for this article. Is there an update for the 2021 quarterly releases?
Thanks
Yoav
Henry_Banks
Product and Topic Expert
Hi yoav.yahav1, nice to hear from you. Thought I'd chime in ahead of matthew.shaw in case he doesn't see this right away.

The 2021 quarterly releases don't change the landscape recommendations, but if you're looking for an updated release calendar, then they are here: Note 2888562 - Harmonized release calendar for SAP Cloud products

the scheduled release dates are:

  1. QRC1: weekend of February 20th
  2. QRC2: weekend of 22nd May
  3. QRC3: weekend of 21st August
  4. QRC4: weekend of November 13th 2021

This reflects the current state of planning but may be updated without notice.

  1. QRC1 includes bi-weekly waves 2020.22, 2020.23, 2021.01, 2021.02
  2. QRC2 includes bi-weekly waves 2021.03, 2021.04, 2021.05, 2021.06, 2021.07
  3. QRC3 includes bi-weekly waves 2021.08, 2021.09, 2021.10, 2021.11, 2021.12, 2021.13
  4. QRC4 includes bi-weekly waves 2021.14, 2021.15, 2021.16, 2021.17, 2021.18

Again, this info reflects the current state of planning but may be updated without notice (i.e. some waves might be moved out to later QRC releases).



Regards

Henry
former_member1161
Active Participant
Thanks henry.banks, this is very helpful!
former_member184709
Participant
Hi Matthew,

 

Is there an idea of a system copy in SAC?

For example, we want to take all of our objects in Production and copy them down into our Test tenant, rather than having to select specific objects.

 

Thanks!
Matthew_Shaw
Product and Topic Expert

Hi Derek

Good question. There is no system (service) copy tool, you need to use the Content Network (or the legacy Export/Import) to transport content from one service to another as you indicated. There's no equivalent 'copy repository' like you could do with say the BI Platform.

Regards, Matthew

former_member184709
Participant
Thank you Matthew !
Luis_Lopez
Discoverer

Hi Matthew,

Very useful and detailed content.

I have a question regarding Cloud vs On-Premise Landscape and the scenario where there are fewer SAC environments than on-premise systems.

For example, if we assume a two-tier landscape for SAC (Option 2 from the landscape options slide - with 1 Dev and 1 Prod tenant) and a 3-tier or 4-tier S/HANA landscape (Dev, QA, Pre-Prod, Prod).

For acquired connections (as we are looking to use SAC Planning functionalities):

  • What is the the recommended connection setup between SAC & S/4 environments?
    1. SAC Dev to S/4 Dev and QA  /  SAC Prod to S/4 Pre-Prod and Prod
    2. SAC Dev to S/4 Dev, QA and Pre-Prod  /  SAC Prod only to S/4 Prod
  • How would the development lifecycle work within the SAC Dev tenant?
    • For example, when a development is ready to be tested against QA (or Pre-Prod), is it just a matter of repointing the same connection to S/4HANA QA (in the connection settings) and reloading the QA/Pre-Prod data?
  • The landscape options slide mentions that SAC "QA environments are not so common"; in this scenario, would you recommend including a SAC QA tenant? Cost aside, what should be the main driver to decide whether to include a SAC QA tenant or not?
  • For the above questions, are there any different considerations for Live Connections?

Thanks in advance for your answers and any information you can share on this topic.

Regards,

Luis

Matthew_Shaw
Product and Topic Expert
Hello Luis,

Many thanks for your feedback and your question; it's always a good idea to clarify things. It's a big topic, but I shall give it a go...

Firstly, there’s a big difference between acquired and live models. Models based on acquired data require you to load the model with data, unlike a live model.

It means for a live model it’s as simple as pointing it to a different connection and that’s it. (Though in practice you don’t do this; you need to follow a slightly different workflow that results in the same thing by re-using the same connection ID, but that connection has different details inside it. You simply create a live connection, give it a generic name, and transport it from dev to prod. In prod you do nothing more than update that connection (not create a new one) to use the production data source, rather than the development one. Then all you need to do is transport ONLY the model, without the connection, each time. Then when the model is in the target, it will use the same connection ‘ID’ (as in the source) but, because you’d edited it to use a different server, it will connect to production data.)

Even then, a live model tends not to have much extra in it that it doesn’t automatically get from the data source. In many cases a live model may not need to be transported, but simply updated in the target, by re-reading its new definition from the data source. You do nothing more than edit the model and press save! You only need to transport a live model if you added something extra to it (like grouping measures for example).

For acquired models you have another aspect to manage, which is the data, not just the model definition.

For acquired data models (and data sets) then the data in the model is typically imported via a connection (though there are workflows that import data via an uploaded csv/xls file that then doesn’t use a connection).

That model can import data from anywhere! Though often you’ll limit the connections to only the ones associated to that environment (i.e., for SAC dev, you’ll probably have just ‘dev’ data connections). There’s nothing stopping you from importing data into your SAC dev model using a connection (still defined in SAC dev) but that is coming from a production data source. You might want to do this (in SAC dev, or SAC test) to really help ensure not only your model but any stories/applications are designed correctly for production data, even though you’re working in SAC dev or SAC test.

This should help to answer your question about a single SAC service ‘connecting’ to dev and qa. (“Connecting” isn’t really a great word for acquired models, since you run a job to import/export data to/from one or many different data sources of your choice)

For planning use-cases, it’s production that holds all those different versions of planning data, and so the data itself doesn’t go through separate SAC landscapes (dev-test-prod); it all happens in prod. You tend to then only need to worry about the model itself, not the data!

It’s generally best not to transport the model with its data, but instead without data, because the size is problematic. Besides, you tend to keep the data separate anyway between environments, and once a model is in a new environment you can run jobs to import data for that environment. Simply de-select the option ‘Data’, as shown here for Model 2, to transport a model without its data:


If you’ve already exported a model with data, you can also choose not to import the data, but still import the model, at the time of import.

(There are times, for things like formula/hierarchies, when you do need to transport a model with its data but that’s outside the scope of this article/blog. Nevertheless, it’s on my to-do list for a deep-dive session later. For now, do a simple test of transporting the model without the data and check everything has reached the target. If something is missing you might need to transport with data)

This means, that in general models are transported without their data.

Your choice of a ‘QA’ SAC service is most likely driven by the need to validate stories, rather than models. Models tend to be fairly static in nature, but of course they do change a little over time, and they tend to change a lot at the start of a project. You’ll need that QA environment when validating new stories/applications before transporting them to production. And you might need that QA environment to validate a model too, and to validate what will happen to the production model when you do transport it!

For live connections, additional environments become more critical simply because a single SAC model (and all its dependent stories/applications) can only use a single connection at any one time and there’s a need to test the data source itself because the stories are so dependent on it. This isn’t quite the same for an acquired model, as you just import the data and it’s the model that you need to validate. And stories built on an acquired model can be changed to use a different acquired model, unlike stories based on a live connection.

(Though the practice of copying an acquired model and its stories, to then update each and every story to point to that copied acquired model, isn’t something I’d recommend for life-cycle best practices, and I would strongly discourage this. There are too many other things that are dependent on the unique ID of the model (and the stories) remaining static - but it might be a poor workaround for the short term if you wanted to keep the number of SAC environments to a minimum.)

I would say QA environments are increasing in popularity as our customers’ maturity with the service improves over time.

I hope this has helped. I see, separately, you’ve been in touch with your SAP CEE and I’ll propose we have a quick chat to iron out any other points you might have. But thank you for posting here, as I’m sure others will benefit from the discussion.

All the best, Matthew
Niladri_B_Nayak
Active Contributor
Excellent insight. Thanks for sharing.
Hello and thank you for the very useful information, Matthew!

Could you please share some more KBAs on these topics? The link to KBA 2901506 regarding SAC namespaces is not working.

Regards, Anna
Matthew_Shaw
Product and Topic Expert
Hello Kurmeleva

I've just checked https://launchpad.support.sap.com/#/notes/2901506 and it works for me. Could you try again, perhaps with a different browser please? Thanks, Matthew
peter_warren_uk
Explorer

This is the "Go-To" reference on this, thanks for the recent update

Given that all data actions and stories are linked with the model, it would be difficult to have training based on the customer's delivered planning system co-existing in either the Dev or QA tenant. I'd like to know if you would recommend having a small additional tenant just for training, or whether there's a best-practice solution.

 

https://answers.sap.com/questions/13610546/best-practice-training-environment.html

Would appreciate your view on this

Thanks

Peter

vtr1cob
Participant
matthew.shaw

Hello, nice blog. It will be useful for deciding how many SAC instances will be needed for our project: Sandbox, Dev, Quality, Production, etc.

 

One question on a requirement from one of our customers: they want to use multiple Workspaces in the same existing tenant (SAC is already live in one country). Now they want to implement SAC planning in another geography using the same existing tenant, but with a separate workspace.

Saw some blogs from Andrew Situ indicating that this is possible. Can we recommend the customer use Workspaces to segregate data and access for the two geographies, using different Workspaces in the same tenant?

Please let us know your thought on this.

 

 
Matthew_Shaw
Product and Topic Expert
Hello trvnath, Thank you for your feedback.

For a project that has 4 environments like yours (Sandbox, Dev, Quality, Production), you'll need an SAP Analytics Cloud Service per environment, assuming you want to replicate the same life-cycle management change controls and processes for SAP Analytics Cloud as for your other systems/data sources. As I mention in the article, you may not need all 4, but the principle is you will.

SAP Analytics Cloud Workspaces is really a security question, since Workspaces have no consequence or application for the purposes of life-cycle management of content. Why? Because everything relates to everything (almost) by ID, and Workspaces don't change this. For security-related questions, kindly either post a new question to the community, or ask a question on my security blog post (which needs a bit of an update, but is still perfectly valid).

Regards, Matthew
Hi Matthew:

What challenges would you foresee with a 2-tier environment (DEV, PROD) versus a 4-tier one (DEV, QA, PREPROD, PROD)? Are there any specific challenges one should be mindful of?

Great article, enjoyed the read.
harshgargaz
Discoverer
Hello Matthew

I have a question related to version management of artifacts transported via the Content Network instead of the file-system export/import. If we create one export job on date X, and the same artifacts are packaged on date X+1, will they be the same artifacts underneath, with reference to the same Model/Story IDs being shared to other environments, or will they actually be point-in-time snapshots of the models/stories packaged in different .package files?

br

harsh garg
JaySchwendemann
Active Contributor
Hello Matthew,

excellent content. Many(!) thanks for your efforts on this, trying to make a complex situation clear.

I just have a possible minor question: in the FAQ section, shouldn't the screenshot point the other way around, so that "Test Preview" and "Dev" go to the bottom while "Quarterly Release Cycle" and "Prod" go to the top?

Cheers

Jens
Matthew_Shaw
Product and Topic Expert

Hi Jens

Thank you for your feedback. Yes, you are spot on. I will fix that confusing image right away. Thanks so much for spotting it

Thanks again, Matthew

[update] The image is now corrected. Actually, I've put it so Prod is at the bottom (on QRC) and Dev at the top (on Test Preview)

Wu
Product and Topic Expert
Hello Matthew.

This article mentions "connection ID" several times. What is the ID of a connection? Is it the name of the connection, or the full object ID that we can check in the RESOURCE model of the system overview folder?

In my test, I deleted a live connection and then created a new one with the exact same connection name and details; the live model recognised the new connection as well. I assume the ID of the new connection I created is different from the old one.

So can I understand it such that the name of the connection can also be treated as the "connection ID"?

Many thanks ahead!

Wu Dongxue
Matthew_Shaw
Product and Topic Expert
Hello Wu,

For a connection, the connection ID is the name of the connection. It means if you delete a connection and create it again with the same name, it will have the same connection ID as the one you deleted. Connections (and a few other exceptions) are unusual in this sense because other objects don't behave like this. Other objects will be given a new ID, even if they share the same name as they did before.

My emphasis on 'create a connection once and then transport it' doesn't strictly apply to a connection, but the principle is important, and it might lead to thinking it could be done with other objects when this is not the case. Additionally, we don't know when, or even if, the product will change, which would then mean a connection gets a new unique ID when created, rather than one currently determined by its name. We need to avoid surprises, so it makes sense for any practice to follow how the product is intended to work.

Hope this helps, Matthew
Wu
Product and Topic Expert
Hello Matthew.

Understood. Thank you so much for the reply and support 💙. It is my honour to learn from a senior expert like you.

Many thanks for your time and best regards,

Wu Dongxue
gradc19
Explorer

Excellent content, a single source for the entire SAC landscape... Thank you very much for putting it all together and responding to comments. It helps a lot.