Technology Blogs by SAP
Learn how to extend and personalize SAP applications. Follow the SAP technology blog for insights into SAP BTP, ABAP, SAP Analytics Cloud, SAP HANA, and more.
Welcome back to the "Surviving and Thriving with the SAP Cloud Application Programming Model" series - #CAPTricks! In the previous post, we covered the basics of getting started with TypeScript and discussed how to set up a proper development workflow using tools like ESLint, Prettier, and husky. Now, it's time to take the next step and dive into the world of running and debugging your CAP applications locally.

In this post, we will explore the various challenges and hurdles you may encounter, and provide you with actionable tips and techniques to navigate them successfully. Whether you're a beginner or an advanced developer, this post is designed to help you take your CAP development skills to the next level. So, grab a cup of coffee (preferably beer) and let's get started!

If you are interested, all of the tips & tricks I cover in this blog post are included in the following sample repository: Enhance core ERP business processes with resilient applications on SAP BTP.

(In case you are wondering why I'm writing all of this down, be sure to check out the first post in this series where I write about my motivation for writing this blog post series.)

Mocking SAP S/4HANA APIs with OData expands - fix the association issues! 

Leveraging the SAP Cloud Application Programming Model (CAP) makes it easy to extend applications to other SAP solutions like SAP S/4HANA or SAP SuccessFactors. With the import functionality that CAP offers, calling the APIs of these (and many other) endpoints is a breeze. In the case of an SAP S/4HANA system, all you need is the OData metadata file (EDMX): with a single cds import, you can call the APIs within CAP just like any other database table in your app. The key here is cds.ql - CAP takes care of the protocol-specific translation depending on the target that is called, whether it's SQL for a database table or HTTP for an external service.
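As a quick illustration, importing a downloaded EDMX file is a one-liner (the file path and name below come from my sample setup; yours will differ):

```shell
# import the OData metadata of the Business Partner API into your CAP project;
# this adds the CSN model under srv/external/ and an entry to your cds configuration
cds import ./srv/external/OP_API_BUSINESS_PARTNER_SRV.edmx
```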

So now that we've covered the basics (😉) of how to call external services with CAP, let's talk about how to run your application locally during development. Whether you're using your own machine or the SAP Business Application Studio, you may not always want to be connected to external services like an SAP S/4HANA API - perhaps because you don't have a real system at hand yet, but want to start developing right away. But don't worry! As long as the external service is mentioned in the requires section of your cds configuration, whether in the package.json or the .cdsrc.json, you can easily mock these endpoints by adding the --with-mocks flag to your cds run or cds watch command. This makes it easy to test and develop your application locally without relying on a live connection to external services.
cds run --with-mocks --in-memory

Running the command with the --with-mocks --in-memory flags will spin up your application and create database tables for the entities of your external services in an in-memory SQLite database. It also fills those tables with sample data that you have provided in the /external/data directory of your service. The best part? You don't need to change anything in your code - the behavior of calling the external mock service remains the same.
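For reference, the requires entry for such an external service could look roughly like this - a sketch, assuming the service was imported as OP_API_BUSINESS_PARTNER_SRV and a destination named s4-hana-onpremise exists (both names are placeholders from my setup; the kind value may also be plain "odata" depending on your CAP version):

```json
"cds": {
  "requires": {
    "OP_API_BUSINESS_PARTNER_SRV": {
      "kind": "odata-v2",
      "model": "srv/external/OP_API_BUSINESS_PARTNER_SRV",
      "credentials": { "destination": "s4-hana-onpremise" }
    }
  }
}
```

With --with-mocks, CAP serves this entry from the in-memory database instead of calling the destination.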

Now to the problem: "You can’t get data from associations of a mocked service out of the box.

The associations of imported services lack information how to look up the associated records. This missing relation is expressed with an empty key definition at the end of the association declaration in the CDS model ({ })."

That means, the imported OData metadata works as long as ...

... the external service endpoint is not mocked or
... the external service endpoint is mocked and you are not using any OData expand operations.

That is because the actual implementation of the external service handles resolving the associations, which is not reflected in the EDMX metadata.

cds import also warns discreetly about the missing metadata information

Example: You want to select all Business Partner addresses that belong to a single Business Partner. You can do that using an OData expand; with cds.ql, it would look like this:
const s4Bupa = await this.bupaSrv?.tx(req).run(
  SELECT.from('OP_API_BUSINESS_PARTNER.A_BusinessPartner', (bp: any) => {
    bp.BusinessPartner, bp.BusinessPartnerIsBlocked, bp.FirstName, bp.LastName,
    bp.to_BusinessPartnerAddress('*')
  }).where({ BusinessPartner: businessPartnerID })
)

(watch out for bp.to_BusinessPartnerAddress('*'), which is translated to an OData expand)

If you are now running the application with the command cds watch --with-mocks --in-memory and the imported default metadata, you may encounter an error similar to this one when the code above is executed:
[Error: SQLITE_ERROR: near ")": syntax error in:

SELECT b.BusinessPartner AS "b_BusinessPartner", b.AddressID AS "b_AddressID",..., filterExpand.BusinessPartner AS "filterExpand_BusinessPartner"
(SELECT DISTINCT BusinessPartner FROM (SELECT a.BusinessPartner AS BusinessPartner FROM OP_API_BUSINESS_PARTNER_SRV_A_BusinessPartner a WHERE a.BusinessPartner = ?))
filterExpand ON ( )] {

The ON clause for the SQL Join is empty, due to the missing information in the OData associations. Long story short: You need to adjust the ON condition in the imported CDS definition to express the relation. For example, in the Business Partner API, the CDS definition should look like this:
entity OP_API_BUSINESS_PARTNER.A_BusinessPartner {
  key BusinessPartner : String(10);
  BusinessPartnerFullName : String(10);
  // ... more elements ...
  to_BusinessPartnerAddress : Association to many OP_API_BUSINESS_PARTNER.A_BusinessPartnerAddress
    on to_BusinessPartnerAddress.BusinessPartner = BusinessPartner;
}

entity OP_API_BUSINESS_PARTNER.A_BusinessPartnerAddress {
  key BusinessPartner : String(10);
  key AddressID : String(10);
  // ... more elements ...
}

In summary, the ability to mock external service endpoints using the --with-mocks flag will come in handy in many situations, such as when running tests as part of your CI/CD pipeline.

For a general overview of consuming external services, have a look at the Consuming Services guide in the CAP documentation.

API behind the SAP Cloud Connector times out

Let's imagine a scenario where you've reached a point in your CAP project where you want to run your application locally. You've already completed the necessary steps such as:

  • Enabling the OData APIs on the source system (like SAP S/4HANA or any other API on other SAP solutions) if necessary

  • Connecting the Cloud Connector to your SAP BTP subaccount

  • Connecting the source system (like SAP S/4HANA) to the Cloud Connector

  • Creating a new destination on SAP BTP to the API in the backend

  • Adding the destination details to the cds configuration (either package.json or .cdsrc.json)

  • Creating the required SAP BTP service instances for XSUAA, the SAP Destination service, and the Connectivity service

  • Executing cds bind for all of the above services, which creates a .cdsrc-private.json file

Everything looks great, but when you start your application with cds watch --profile hybrid, you get an error message in your log output when your application tries to connect to the SAP S/4HANA on-premise API:

Error during request to remote service: \nconnect ETIMEDOUT

If you're encountering this error, don't fret. It's a common issue, as APIs accessed through the SAP Cloud Connector are only accessible within the SAP Business Technology Platform. The root cause lies with the SAP BTP Connectivity service, which resolves the virtual host (specified in the connection configuration for the on-premise backend system, such as SAP S/4HANA) to a valid host name.

The service key for the Connectivity service instance contains several properties, including the onpremise_proxy properties, which you don't usually see for other service instances. By examining the service key, you get an insight into how the Connectivity service works.
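Stripped down to the parts that matter here, the service key looks roughly like this (shortened and anonymized; the exact set of fields may vary by service plan and version):

```json
{
  "onpremise_proxy_host": "connectivity-proxy.example.internal",
  "onpremise_proxy_port": "20003",
  "clientid": "...",
  "clientsecret": "..."
}
```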

Service Key details for an instance of the Connectivity service

The documentation for the Connectivity service outlines that a call from the CAP application, expressed as a plain curl call, would look like this (simplified; proxy host and port are the onpremise_proxy_* values from the service key):
curl http://myvirtualhost:44300/sap/opu/odata/sap/API_BUSINESS_PARTNER --proxy http://<onpremise_proxy_host>:<onpremise_proxy_port> --header "Proxy-Authorization: Bearer ..."

Well well well, this curl is taking a little siesta and won't be waking up anytime soon. So, Ms. Marple/Mr. Holmes, time to put on those detective hats and figure out what could be the culprit!

By simply trying to reach the Connectivity Proxy on the designated port, we can - hopefully - get a better idea if this is the root of our problems and get one step closer to a solution.

I've used netcat (nc) in two different environments - one where the connection to the on-premise backend is working (SAP BTP, Cloud Foundry runtime, simulating an actual deployed CAP application) and one where it isn't: my own machine. First, I tried it from within SAP BTP, Cloud Foundry. For that, I SSH'd into a running application on SAP BTP, Cloud Foundry and used netcat to connect to the onpremise_proxy_host and onpremise_proxy_port:

successfully connected

Voila - it works! Let's try the same command (slightly different, since the netcat distribution on my macOS is different) on my own machine:

connect timed out - not able to connect

Aha! Netcat struck out. The Connectivity Proxy is playing hard to get and won't let us reach it from outside SAP Business Technology Platform. Q.E.D. - it's the Connectivity Proxy causing all the trouble.
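If you'd rather script this check than remember netcat flags, a minimal Node.js equivalent could look like this (a sketch; checkReachable is my own helper name, and host/port would come from the onpremise_proxy_* properties of your Connectivity service key):

```typescript
import * as net from "net";

// Try to open a TCP connection, resolving true on success and false on
// error or timeout - roughly what `nc -z host port` does.
function checkReachable(host: string, port: number, timeoutMs = 3000): Promise<boolean> {
  return new Promise((resolve) => {
    const socket = net.connect({ host, port });
    const done = (ok: boolean) => { socket.destroy(); resolve(ok); };
    socket.setTimeout(timeoutMs, () => done(false)); // mirrors netcat's timeout
    socket.once("connect", () => done(true));
    socket.once("error", () => done(false));
  });
}
```

Run against the proxy host and port, this resolves to true from inside SAP BTP and false from your local machine - the same verdict netcat gave us.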

But don't worry, we've got the evidence we need. Now it's time to figure out how to fix it. There are a few different approaches we can take:

Option 1: Use the SAP Business Application Studio

The SAP Business Application Studio is a service within the SAP Business Technology Platform and as such, it is capable of accessing the proxy of the SAP Connectivity service. The only additional requirement is to provide an environment variable pointing to the additional proxy that SAP Business Application Studio provides (see the destination configuration documentation for SAP Business Application Studio).

Option 2: Use Port-Forwarding with an SSH Tunnel to SAP BTP, Cloud Foundry Runtime

There's a neat trick you can use: tunnel your requests into the SAP BTP, Cloud Foundry runtime using SSH port-forwarding. However, this method does have a couple of requirements:

  • You need to have already deployed your application to the SAP BTP, Cloud Foundry runtime so that you can grab the VCAP_SERVICES information, which is an environment variable containing the service binding details.

  • Instead of using cds bind for service binding, you should opt for the CF CLI plugin DefaultEnv (thanks philip.mugglestone!) which builds a default-env.json for you. During design-time, you need to modify the default-env.json and replace the onpremise_proxy_host with "localhost", so that you can forward requests using cf ssh app-name -L localhost:proxy-port:proxy-host:proxy-port.
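Putting the pieces together, the workflow could look like this (a sketch; the app name and port are placeholders - take the real values from the onpremise_proxy_* properties in your default-env.json):

```shell
# 1. fetch the environment of the deployed app into default-env.json
cf DefaultEnv my-cap-app

# 2. in default-env.json, replace the value of onpremise_proxy_host with "localhost"

# 3. forward the proxy port through the deployed app
#    (pattern: cf ssh app-name -L localhost:proxy-port:proxy-host:proxy-port)
cf ssh my-cap-app -N -L localhost:20003:connectivity-proxy.example.internal:20003
```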

Option 3: Use SAP API Management to wrap the Destination to the backend system

The backend API you want to access is not publicly available. You could use SAP API Management to create an API Provider with the information you have in your SAP BTP destination. You can then use the public API that SAP API Management exposes in the configuration of your CAP application instead of an SAP BTP destination.

But here's a friendly warning: this approach should be tackled with caution. By exposing the backend API to the big bad internet, you're playing with fire! Better to stick with the previous two methods, where the backend endpoint stays tucked away.

Running an HTML5 App (Managed App Router) locally

It's common to run CAP applications locally and have them connect to SAP BTP services without any issues. However, when it comes to integrating a frontend with these locally running CAP applications, things can get a bit tricky. This is especially true for frontends that use the HTML5 Application Repository and rely on destinations and service bindings within their configurations.

Let's consider a frontend application with an xs-app.json file (see below) that either serves the frontend directly from the HTML5 Application Repository or forwards calls to a destination named BPVerification-srv-api, an endpoint exposed by the CAP application.
{
  "welcomeFile": "/index.html",
  "authenticationMethod": "route",
  "routes": [
    {
      "authenticationType": "xsuaa",
      "csrfProtection": false,
      "source": "^/srv-api/(.*)$",
      "destination": "BPVerification-srv-api",
      "target": "$1"
    },
    {
      "source": "^(.*)$",
      "target": "$1",
      "service": "html5-apps-repo-rt",
      "authenticationType": "xsuaa"
    }
  ]
}

Running the frontend app locally is a bit tricky, especially if you are new to UI5 and only want to run your SAP Fiori elements app generated through CAP CDS annotations. So, how would one run the application locally?

  • One option is to use UI5 Tooling like ui5 serve to serve the static files, but you may need to do some additional configuration to connect to the CAP backend. It can be a daunting task and might require a deeper understanding of UI5. (I'm already struggling here, since I have very limited know-how. Full stop 🙂)

  • Another option is to run a standalone AppRouter locally, but you'll need an additional xs-app.json that serves the files from a local directory (using the localDir property as a replacement for the service route) instead of relying on the html5-apps-repo-rt service. While this might be a bit easier, it still requires some setup. Another xs-app.json? Nope, not an option for me!

Despite the various workarounds available for running your frontend locally, these methods may not work for you as they differ from your productive setup. Without additional tooling, making changes to your frontend requires deployment into the HTML5 Application Repository using the html5 CF CLI plugin or UI5 tooling. Once deployed, you can only connect to a deployed version of your backend, as it points to an SAP BTP destination, like BPVerification-srv-api in my xs-app.json file. However, if you're looking to do local development without deploying anything or connecting to a deployed backend, then ...

👉 HTML5 Repo Mock to the rescue!

The HTML5 Repo Mock offers a simple solution for your app. All you need to do is set a few environment variables to enable local service binding and provide a destination that points to your local CAP application, and you're good to go!

  • Build your frontend app and deploy it once to SAP BTP. This way, you can easily retrieve the environment variables in the next step:
    npm install, npm run build and npm run deploy.

    "scripts": {
      "build": "npm i && npm run clean && ui5 build --include-task=generateManifestBundle generateCachebusterInfo && npm run zip",
      "zip": "cd dist && npx bestzip ../ *",
      "deploy": "npx -p @sap/ux-ui5-tooling fiori add deploy-config cf"
    }

  • Install the CF CLI Plugin DefaultEnv: cf install-plugin DefaultEnv

  • To retrieve the environment variables of your deployed app, run the following command from your app directory (the same level as the package.json file): cf DefaultEnv <deployed-app>. This will create a default-env.json file in your app directory, which will be automatically added to the environment variables during the start of HTML5 Repo Mock. This ensures that the service bindings are available locally, even if more services are included than are actually required. The advantage is that it requires minimal effort on your part - just a single command and no formatting issues.

  • Create a .destinations file in the app directory that specifies the URL to your local CAP endpoint. The content of the file should look like this:
    [
      {
        "name": "BPVerification-srv-api",
        "url": "http://localhost:4004"
      }
    ]

  • Start the HTML5 Repo Mock and set the information from the .destinations file as an environment variable.
    "scripts": {
      "start:local": "destinations=`shx cat .destinations` PORT=5002 node node_modules/@sap/html5-repo-mock/index.js"
    }

The reason the file is named .destinations is to separate the service bindings and the destination information into separate files. I have seen many examples where everything is stored in a single default-env.json that cannot be committed to your git repository. This separation allows the destination information to be added to the GitHub repository without the risk of including sensitive service credentials, which live in the default-env.json file.

The .destinations file contains the URL to the local CAP endpoint and is used to set the destinations environment variable during the start of the HTML5 Repo Mock. By keeping the destination information in a separate file, others can use the frontend application without having to worry about potential formatting errors in a combined default-env.json file that they would have to build manually. Downside: the destinations environment variable (see start:local above) needs to be set manually, unlike default-env.json, which is recognised automatically.
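To make the separation concrete, here's what the start:local line effectively does, sketched as a small Node.js helper (buildMockEnv is my own name; nothing CAP-specific here - the HTML5 Repo Mock just reads the destinations and PORT environment variables):

```typescript
import * as fs from "fs";

// Read the .destinations file and prepare the environment for the HTML5 Repo
// Mock: `destinations` holds the JSON string, PORT the port to serve on.
function buildMockEnv(destinationsFile: string, port: number): NodeJS.ProcessEnv {
  const destinations = fs.readFileSync(destinationsFile, "utf8").trim();
  JSON.parse(destinations); // fail fast on formatting errors before the mock starts
  return { ...process.env, destinations, PORT: String(port) };
}
```

The mock still picks up default-env.json on its own; this helper only covers the two variables that the start:local script sets explicitly.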

Again: There's plenty of different options to provide the appropriate environment variables.

Starting the HTML5 Repo Mock

A few additions: in the script that launches the HTML5 Repo Mock, I used a tool called shx. It allows me to run shell commands across operating systems - cat, for example, is only available on Linux/macOS and not on Windows.

I intentionally used port 5002 for my frontend application because the default port 5000 is already in use by the macOS Control Center for some AirPlay functionality.

The links in the terminal output are clickable, so you can simply click on "http://localhost:5002/comsaptfebpbusinesspartners-1.0.0/" to launch the frontend app served by the HTML5 Repo Mock. You can access the HTML5 Repo Mock's logs at http://localhost:5001/logs - this is also useful to know when something goes wrong!

I've built some scripts to start all of the needed artifacts at once. To fully develop locally, I need the following running:

  • cf ssh <deployed-app> -N -L <local-port>:<onpremise_proxy_host>:<onpremise_proxy_port> to tunnel requests to the SAP Cloud Connector

  • The HTML5 Repo Mock to serve my frontend application locally

  • The CAP application with --profile hybrid (so that my external service doesn't get mocked)

Those scripts in my package.json reflect the above stated requirements:
"scripts": {
  "start:hybrid": "cds-ts watch --profile hybrid",
  "start:tunnelling": "cf ssh BPVerification-srv -N -L",
  "start:ui": "npm run start:local --prefix app/businesspartners",
  "start:production-like": "concurrently --names \"CC,APP,UI\" -c \"green,blue,magenta\" \"npm run start:tunnelling\" \"npm run start:hybrid\" \"npm run start:ui\""
}

A single npm run start:production-like in the root of my project starts everything I need to run my full-stack application locally, connected to an on-premise backend system.

Log output during startup of HTML5 Repo Mock, CAP app and ssh tunnel

In case this is a little overwhelming: have a look at the sample repository and navigate through the directories and the corresponding files (like the package.json) yourself.

Debugging (remote) applications

I'm a debugger. As a debugger, it's not uncommon for me to encounter applications that don't perform as expected. When this happens, my go-to method to gain a better understanding is debugging. While researching for this blog post, I came across a comprehensive blog post by arley.trianamorin about debugging deployed CAP applications. It provides valuable information on this topic and eliminates the need to reinvent the wheel. 😉

In addition to the information from the blog post, I also use a shell script that helps me send the debugging signal (debugging signal?! -> you better read the previously mentioned blog post!) to the remote application. Making it a function in my zsh/bash profile makes it even easier to use:
if [ $# -eq 0 ]
then
  echo "you are missing the appname."
  exit 1
fi

cf ssh $1 -c 'export PID=$(lsof -ti :$PORT) && kill -s SIGUSR1 $PID'
cf ssh $1 -N -T -L 9229:localhost:9229

In order to debug your local application, you have a range of options available. I prefer to start it from the Run and Debug menu in SAP Business Application Studio or Visual Studio Code. This way, the debugger is directly connected to the process running the CAP application. Here's my preferred configuration:
"version": "0.2.0",
"configurations": [
"command": "DEBUG=hana cds-ts watch --profile hybrid",
"name": "cds-ts watch",
"request": "launch",
"type": "node-terminal",
"skipFiles": ["<node_internals>/**"],
"envFile": "${workspaceFolder}/.env"
"name": "Attach to a Cloud Foundry Instance on Port 9229",
"port": 9229,
"request": "attach",
"type": "node",
"localRoot": "${workspaceFolder}",
"remoteRoot": "/home/vcap/app"


Additionally, if you are running your application locally and want debugging output of certain CAP components on the console, have a look at the debugging variables in the official documentation. Especially if you are interested in how CAP constructs certain SQL statements, the hana and sqlite component IDs can be of interest to you.

log output for sqlite components caused by DEBUG=sqlite

Debugging your CAP app doesn't have to be a mystery, especially with all the resources available. And while the CAP documentation has some great tips and tricks, including the Auto Attach feature in certain IDEs, I personally wasn't a fan. But hey, different strokes for different folks, right? Happy debugging! (Or, if that doesn't work, happy hair-pulling and frustration-venting! 😉)

Debugging configurations in Visual Studio Code (also applies to SAP Business Application Studio)

It's always a journey to find the right tools and techniques, especially in the world of local development with CAP. That's why I'm eager to hear all about your experiences, thoughts, any other approaches you've tried and even what you are still searching for in the world of local CAP development. Have you stumbled upon any hidden gems that you'd like to share? 

Who knows, maybe we'll be solving each other's problems in no time! And if not, well, we'll just have to sit down, grab a ☕ (or a 🍺), and chat about it in the next blog post. How does that sound to you?

PS: A big shoutout to kay_ (and many others) for always being my go-to person for technical discussions and support!