Technology Blogs by SAP
Learn how to extend and personalize SAP applications. Follow the SAP technology blog for insights into SAP BTP, ABAP, SAP Analytics Cloud, SAP HANA, and more.
Yesterday I got an email from my dear colleague Patrick O'Brian of the SAP BTP SDK for iOS Team. He informed me that they've just released a brand-new open-source repository containing an implementation of ARKit APIs and controls that follows the SAP Fiori for iOS Design Guidelines.

With this open-source package released, you as an iOS developer now have the possibility to build Fiori-style native business apps utilising Apple's ARKit technology. With ARKit, Apple gives us the opportunity to use the power of augmented reality to take the experience of our apps to a new level. ARKit is now at version 5, so the APIs have been available for a while; what the release of the SAP Fiori for iOS ARKit brings to the table is a set of APIs and ARKit-ready UI controls to enhance your business apps with meaningful and exciting functionality! You don't have to create these assets yourself or think about integration or deep implementation work, as the SAP BTP SDK for iOS team provides this out of the box.

Don't think you get a limited and boxed development experience: what you get is a highly modular and easy-to-use package that delivers a lot of features out of the box while still allowing a high level of customisation!

In this blog post I want to introduce you to the package and show you the example app built into the repository on GitHub, which is not only cool to try out but also gives you an amazing starting point to familiarize yourself with the APIs and controls.

SAP Fiori for iOS ARKit - Demo App

Getting Started

The basic idea of the provided package is that you operate on AR Annotations in combination with real-world marker locations. Each annotation refers to a Card that matches a corresponding Marker, which is placed relative to an image or 3D object in the real world. To show these cards, the user scans the image or object with the AR Scanner.

The great thing about using the provided package is that you won't need to hire a 3D model artist for your project to display the AR annotations, because all the controls are provided through the available APIs. All the controls are implemented with SwiftUI:

  • ARScanView

  • MarkerView

  • CardView

So all you have to do is provide a scene of markers relative to an image or object anchor in order to display the AR annotations properly. Such a scene is easily created with Apple's Reality Composer.

Reality Composer

I want to give you a quick overview of what Reality Composer is in case you don't know it. Reality Composer is a tool, created by Apple, to ease the process of creating AR scenes. These scenes contain 3D and 2D objects which can easily be placed onto the project canvas, which represents a three-dimensional space. The tool is built in a way that you can use it without being a 3D model artist, and it feels familiar if you have worked with Keynote before.

Reality Composer can be downloaded from the Apple developer center or as an app from the Apple App Store.

Using the app on your iPhone or iPad allows you to directly scan real three-dimensional objects and automatically import them into Reality Composer. Another cool feature is that you can try out your scenes right away, as the Reality Composer app has a direct integration with an ARKit implementation provided by Apple. If you create your scenes with the app on macOS, you can play them directly to your iPhone or iPad and try them out there as well.

Reality Composer - Apple

Composing the Scene

The process of composing a scene is as follows:

  1. Open the Reality Composer app and create a scene with an image or object anchor

  2. Choose an image or scan an object and give the scene a name e.g. ExampleScene

  3. Place spheres in the desired positions

  4. Preview in AR to fine tune

  5. Name the spheres with a type that conforms to LosslessStringConvertible

  6. The name of the sphere will correspond to the CardItemModel id

  7. Export the scene depending on the chosen supported loading strategy

    • Export the scene as .usdz file (Enable usdz export in preferences or iOS app settings)

    • Export the scene as a .reality file

    • Save the entire project as an .rcproject with a single scene

Reality Composer - Composing a Scene

Keep in mind that Reality Composer is required to scan an object when choosing an Object anchor for 3D objects, and that needs a physical iOS device with the Reality Composer app installed on it. The added anchor spheres are only used for scene creation and will be invisible in the ARCards scene, so no worries 🙂.
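The naming rule in step 5 matters because the package parses a sphere's entity name back into your card model's id. Any type that round-trips through its string representation works, including plain String. Here is a minimal, self-contained sketch; the CardID type below is hypothetical and not part of FioriARKit:

```swift
import Foundation

// Hypothetical numeric card identifier. LosslessStringConvertible
// means the value can round-trip through its string description,
// so an entity name like "1" set in Reality Composer can be parsed
// back into a typed id.
struct CardID: LosslessStringConvertible, Equatable {
    let value: Int

    init?(_ description: String) {
        guard let value = Int(description) else { return nil }
        self.value = value
    }

    var description: String { String(value) }
}

let sphereName = "1"              // entity name set in Reality Composer
let id = CardID(sphereName)       // parsed id, nil if the name is malformed
let roundTrip = id?.description   // back to "1"
```

Since String itself conforms to LosslessStringConvertible, simply using string ids works out of the box as well.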

SwiftUI app implementation


In order to use the created scene in combination with the package, you need a SwiftUI Xcode project. In one of my previous blog posts I explained how to create such a project in combination with the SAP Fiori for iOS SwiftUI implementation:

The End2End Journey: Advocates App with OData & SwiftUI

Within the app project you need to load and instantiate the Card Item model. The Card Item model represents the information such an AR annotation/card will hold. The provided API accepts an array of elements, where each element needs to conform to the CardItemModel protocol to properly populate card-related data.

It is important that the id property corresponds to the name of the entity (sphere) from Reality Composer.

Another accepted approach would be to provide a JSON array conforming to the defined model:
// JSON key/value:
"id": String,
"title_": String,
"descriptionText_": String?,
"detailImage_": Data?, // base64 encoding of Image
"actionText_": String?,
"icon_": String? // systemName of SFSymbol

Loading Strategies

The supported loading strategies for the Reality Composer scene projects are as follows:

  • USDZ Strategy: Requires a URL path to the .usdz file

  • Reality Strategy: Requires a URL path to the .reality file and the name of the scene

  • RCProject Strategy: Requires the name of the .rcproject file and the name of the scene

In addition to the card-related data, the package requires these files because they carry the information about the defined anchor, which is used to detect a viable scene.

The API differentiates between Image and Object anchors, and depending on which you choose it expects different parameters. An Image anchor requires the anchorImage and physicalWidth initializer parameters, whereas an Object anchor already carries its own information, so those parameters can be nil.
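This parameter rule can be summarised in a toy validation function. This is a self-contained illustration of the rule only; AnchorConfiguration and validate are made-up names and not part of the package API:

```swift
import Foundation

// Toy model of the anchor rule: an image anchor needs both the
// reference image and its real-world width (in meters), while an
// object anchor carries its own geometry and needs neither.
enum AnchorConfiguration: Equatable {
    case image(physicalWidth: Double)
    case object
}

func validate(anchorImage: Data?, physicalWidth: Double?) -> AnchorConfiguration? {
    switch (anchorImage, physicalWidth) {
    case (.some, .some(let width)):
        return .image(physicalWidth: width)
    case (nil, nil):
        return .object
    default:
        // An image anchor with only one of the two parameters is invalid.
        return nil
    }
}
```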

When using the RCProject Strategy, note that the .rcproject file needs to be part of your Xcode project's application bundle, because the file must be available at build time!

You can choose whichever loading strategy fits your needs; the package gives you the freedom to do so.

Here is an example of the creation of a ContentView and loading the data using the USDZ Strategy:
import FioriARKit
import SwiftUI

struct FioriARKitCardsExample: View {
    @StateObject var arModel = ARAnnotationViewModel<DecodableCardItem>()

    var body: some View {
        // Initializes an AR experience with a Scanning View flow,
        // showing Markers and Cards upon anchor discovery
        // - arModel: the view model which handles the logic for the AR experience
        // - image: the image which will be displayed in the Scanning View
        // - cardAction: action to pass to the corresponding card from the CardItemModel id
        SingleImageARCardView(arModel: arModel, image: Image("qrImage"), cardAction: { id in
            // action to pass to the corresponding card from the CardItemModel id
        })
        .onAppear(perform: loadInitialData)
    }

    // Example using a `UsdzFileStrategy` to populate scene-related information
    // (stored in a .usdz file which could have been fetched from a remote server
    // during runtime) as well as card-related information (stored in a .json file
    // which could have been fetched from a remote server as well)
    func loadInitialData() {
        let usdzFilePath = FileManager.default.getDocumentsDirectory()
            .appendingPathComponent(FileManager.usdzFiles)
            .appendingPathComponent("ExampleRC.usdz")
        guard let anchorImage = UIImage(named: "qrImage"),
              let jsonUrl = Bundle.main.url(forResource: "Tests", withExtension: "json") else { return }

        do {
            let jsonData = try Data(contentsOf: jsonUrl)
            let strategy = try UsdzFileStrategy(jsonData: jsonData,
                                                anchorImage: anchorImage,
                                                physicalWidth: 0.1,
                                                usdzFilePath: usdzFilePath)
            arModel.load(loadingStrategy: strategy)
        } catch {
            print(error)
        }
    }
}

Requirements and Limitations

As the package uses the newest versions of ARKit and SwiftUI, you need to be pretty up-to-date with the iOS and Xcode versions you're using. In general, this is not a problem in iOS development, as Apple keeps almost all of their devices up-to-date and supports the operating system consistently.

  • iOS 14 or higher

  • Xcode 12 or higher

  • Reality Composer 1.1 or higher

  • Swift Package Manager

The current limitations of the SAP Fiori for iOS ARKit package, as per the documentation, are:

  • There is not yet an authoring flow for pinning/editing an annotation in app

  • There is not yet an annotation loading strategy which loads an array of positions for annotations relative to the detected image/object

  • While Reality Composer is useful for scene creation, editing the scene programmatically is possible, but those changes cannot be saved to the file


The SAP Fiori for iOS ARKit package is a great addition to the rest of the SAP BTP SDK for iOS. Enhancing your apps with AR functionality can be a challenge due to the lack of 3D artists and the effort of creating proper design patterns and implementations, but with the provided package you get all of this right out of the box. This makes the decision of whether to work with ARKit much easier. Making your business apps innovative and AR-ready is really easy and affordable.

I am excited to see the further progression of this package, how the SDK team will evolve the features and how the community (Yeah looking at you my friends 😜) will contribute!

Go and try it out, make your apps AR ready with the SAP Fiori for iOS ARKit Open Source package, and as always Keep Coding!