Pre-read
This blog post is part of a series of blog posts related to SAP Cloud Platform Alert Notification service.
You can refer to the parent blog post for more detailed information about the service itself.
Let’s take the following situation: we have a solution which we deploy on SAP Cloud Platform. This solution consumes backing services from different hyper-scalers, such as AWS or GCP. Naturally, we would like to have a single approach to receive alerts both from our solution on SAP Cloud Platform and from the hyper-scaler services that it uses. Furthermore, we want to do that in a hyper-scaler-agnostic manner.
In the next couple of blog posts, we are going to show you precisely how you can receive alerts for hyper-scaler backing services via SAP Cloud Platform Alert Notification. That way, you are going to have a common alert management approach for both SAP Cloud Platform and your hyper-scalers, regardless of which one you use.
In this particular blog post, we are going to concentrate on Google Cloud Platform (GCP).
For the very same task on AWS, you can refer to this blog post.
Prerequisites
To start, you are going to need:
- An active GCP account
- An active subscription to SAP Cloud Platform Alert Notification
- An SAP Cloud Platform application deployed on the Cloud Foundry environment and using some of the GCP backing services. You can refer to this blog post for more details about this topic.
Setup
In this particular case, we are going to have the setup below.
- The example application consists of two modules and runs on the SAP Cloud Platform Cloud Foundry environment. It is already sending custom alerts to Alert Notification. You can read more about custom alerts here.
- This application uses Redis provided via Google Cloud Deploy.
- We are using Stackdriver to monitor the backing service.
- We are using Google Cloud Functions to post custom alerts to SAP Cloud Platform Alert Notification.
- Our sample application also posts its own application-specific custom alerts to Alert Notification. For example, it posts an error alert whenever an exception occurs in its code.
Steps to Execute
Follow the steps below to achieve similar results.
Configure your Alert Notification
We are going to start by creating our configuration for Alert Notification. First, we are going to create a client for basic authentication, and then a subscription. That’s right: the Alert Notification REST APIs support both OAuth and Basic authentication methods.
Configure BASIC Authentication Client
- Subscribe to Alert Notification if you haven’t done so already.
- Navigate to your Alert Notification service instance UI.
- In the SAP Cloud Platform Cockpit, navigate to your Cloud Foundry space -> Services -> Service Instances -> click on your instance.
- In the menu, select “Service Keys” and then click on the Create Service Key button.

- In the popup, give your key a name, for example “G Integration Keys”, and in the description field copy and paste the following JSON:
{
  "type": "BASIC"
}
- The created key contains the client, the secret, and the endpoint which we are going to use for our integration, roughly as sketched below.
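For orientation, a service key of type BASIC typically exposes the credentials and the service URL along the lines of the sketch below. The property names and the structure are an assumption on our side and may differ slightly in your landscape, so always check the key that was actually generated for you; the values here are placeholders, not real credentials.
{
  "url": "https://<your-ans-service-api-host>",
  "client_id": "<generated-client>",
  "client_secret": "<generated-secret>"
}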
Configure Subscriptions
In this section, we are going to use one of the coolest features of SAP Cloud Platform Alert Notification: the import feature. We have preconfigured the subscriptions and actions for you.
- Copy and paste the provided JSON below into a text editor.
{
  "conditions": [
    {
      "name": "AnyCondition",
      "propertyKey": "eventType",
      "predicate": "ANY",
      "propertyValue": "",
      "labels": [],
      "description": ""
    }
  ],
  "actions": [
    {
      "name": "NotifyMeByEmail",
      "state": "ENABLED",
      "labels": [],
      "destination": "your.email@here",
      "description": "",
      "type": "EMAIL"
    }
  ],
  "subscriptions": [
    {
      "name": "GCPRedisAlert",
      "conditions": [
        "AnyCondition"
      ],
      "actions": [
        "NotifyMeByEmail"
      ],
      "labels": [],
      "state": "ENABLED",
      "description": ""
    }
  ]
}
- Replace “your.email@here” with the email address at which you want to receive alerts. If you would like to use a channel different from email, create a new action and attach it to your subscription.
- In the Alert Notification UI, click on “Export or Import”.
- Copy-Paste the modified JSON into the import field and click on “Import”.

- Now the only thing left is to confirm your email.
- Go to the “Actions” menu and click on the “NotifyMeByEmail” tile.

- Click on the “Confirm Action” button.

- You are going to receive an email with a confirmation code. Copy the code, paste it into the popup, and then click the “Confirm” button.

Note that you can fine-tune the provided subscription.
Currently, it accepts any alert coming to Alert Notification. To achieve better filtering, you can adjust the provided condition in the Conditions tab of the menu. For hyper-scaler alerts, we recommend using tags or the event type to achieve better results, as in the example below.
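As a sketch of such fine-tuning: the Cloud Function shown later in this post sends events with eventType set to GCP.EVENT, so you could replace the catch-all condition with one that matches only those events. The condition below is only an illustration and the EQUALS predicate is an assumption on our side; pick whichever predicate and property best fit your filtering needs.
{
  "name": "GcpEventsOnly",
  "propertyKey": "eventType",
  "predicate": "EQUALS",
  "propertyValue": "GCP.EVENT",
  "labels": [],
  "description": "Matches only events produced by the GCP integration function"
}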
Configure your GCP
Below is a one-time configuration which you should execute in your GCP account.
We assume that you already have an instance of Redis or some other service that you use in your application.
We are going to use a Node.js-based function from Google Cloud Functions to integrate with SAP Cloud Platform Alert Notification.
Below you can find the code that we have used for our example.
'use strict';

const request = require('request');
const util = require('util');

const REQUEST_TIMEOUT = 10000;

// Entry point of the Cloud Function: receives the Stackdriver webhook call
// and forwards the incident to SAP Cloud Platform Alert Notification (ANS).
exports.postToAns = (req, res) => {
    console.log("Posting to ANS");
    let options = buildOptions(req.body);
    request(options, function(error, response, body) {
        if (error) {
            console.error("ANS has returned an error:");
            console.log(util.inspect(error, { depth: null }));
            res.status(500).send({
                message: "Internal server error occurred"
            });
            return;
        }
        console.log("ANS has processed the event and returned response:");
        console.log(util.inspect(response, { depth: null }));
        res.status(response.statusCode).json(body);
    });
};

// Builds the HTTP request options for the ANS producer API,
// using the credentials provided via environment variables.
const buildOptions = (body) => {
    let ansBody = convertToAnsPayload(body);
    console.log("Converted GCP payload to ANS one:");
    console.log(util.inspect(ansBody, { depth: null }));
    return {
        url: process.env.ANS_SERVICE_API,
        method: "POST",
        auth: {
            user: process.env.CLIENT_ID,
            password: process.env.CLIENT_SECRET
        },
        json: ansBody,
        timeout: REQUEST_TIMEOUT
    };
};

// Maps the Stackdriver incident payload to an ANS custom event.
const convertToAnsPayload = (gcpPayload) => {
    console.log("Received GCP payload:");
    console.log(util.inspect(gcpPayload, { depth: null }));
    let incident = gcpPayload.incident || {};
    return {
        "eventType": "GCP.EVENT",
        "severity": chooseSeverity(incident.state),
        "category": chooseCategory(incident.state),
        "subject": incident.policy_name,
        "body": incident.summary,
        "resource": {
            "resourceName": incident.resource_name,
            "resourceType": incident.resource_name
        },
        "tags": {
            "gcp:incident_id": incident.incident_id,
            "gcp:resource_id": incident.resource_id,
            "gcp:state": incident.state,
            "gcp:started_at": incident.started_at,
            "gcp:ended_at": incident.ended_at,
            "gcp:policy_name": incident.policy_name,
            "gcp:condition_name": incident.condition_name,
            "gcp:url": incident.url
        }
    };
};

// Open incidents are reported as warnings, closed ones as informational.
const chooseSeverity = (state) => {
    switch (state) {
        case "open":
            return "WARNING";
        case "closed":
            return "INFO";
        default:
            return "WARNING";
    }
};

// Closed incidents become notifications, everything else is an alert.
const chooseCategory = (state) => {
    switch (state) {
        case "closed":
            return "NOTIFICATION";
        default:
            return "ALERT";
    }
};
Another option is to use the shared Google Cloud function from here.
Note that this is just a code snippet; you can decide not to use it, to change it, or to proceed with your own implementation.
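To make the mapping more tangible, below is a made-up and trimmed example of the kind of incident payload that Stackdriver posts to the webhook. All values are invented for illustration only; the incidents in your project will look different.
{
  "incident": {
    "incident_id": "0.abcdef123456",
    "resource_id": "",
    "resource_name": "example-redis-instance",
    "state": "open",
    "started_at": 1577865600,
    "ended_at": null,
    "policy_name": "Redis uptime check",
    "condition_name": "Uptime check failed",
    "url": "https://app.google.stackdriver.com/incidents/0.abcdef123456",
    "summary": "Uptime check failed for example-redis-instance."
  }
}
For such a payload, convertToAnsPayload builds an event with eventType "GCP.EVENT", severity "WARNING" and category "ALERT" (because the state is "open"), the policy name as subject, the summary as body, and the gcp:* tags filled from the incident fields.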
Importing sample function into Google Cloud Platform
- In the Google Cloud Console, search for Cloud Functions.

- Once you open Cloud Functions, click on the Create Function button.
- As a name, put gcpToANS.
- Uncheck the Allow Unauthenticated invocation checkbox.
- Click on the ZIP Upload radio button.
- Under ZIP file, click on Browse and upload the provided ZIP (or your own function; see the package.json sketch after these steps).
- Now, in the Stage Bucket section, click on the Browse button.

- We should now create the bucket which will contain our function.
- Click on the + button.

- Now it is time to give our bucket a name, for example ansfunctionsbucket. Then click on the Create button.

- Once we have created the bucket, set the Function to execute to postToAns. This is the Node.js function exported in the ZIP file.
- Now it is time to set our environment variables.
- Click on Environment Variables, networking, timeouts and more.

- In the expanded section, click the Add Variable button two more times and set the following variables.
- CLIENT_ID - the client from the service key which you created in the “Configure BASIC Authentication Client” section of this blog post.
- CLIENT_SECRET - the secret from the same service key.
- ANS_SERVICE_API - assemble this by copying the url property from the service key and appending /cf/producer/v1/resource-events to it. For example: https://clm-sl-ans-live-ans-service-api.cfapps.eu10.hana.ondemand.com/cf/producer/v1/resource-events
- Click on the Create button.

- Wait for the function to compile
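If you package the function yourself instead of using the shared one, the ZIP file should contain an index.js with the code above plus a package.json that declares the request dependency. A minimal sketch is shown below; the package name is arbitrary and the version range is an assumption, so pin whatever version you have actually tested with.
{
  "name": "gcp-to-ans",
  "version": "1.0.0",
  "main": "index.js",
  "dependencies": {
    "request": "^2.88.0"
  }
}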

Configuring Stackdriver
- Back in the Google Cloud Console, search for Monitoring.
- Open Stackdriver and you should see your dashboard.

- In the upper left corner, click on "My First Project" and select Workspace Settings.

- What we are going to do next is to configure a webhook which invokes the function that we just created.
- Go to Alerting > Notifications > WEBHOOKS

- Click on the Add Webhook button.
- Fill in the URL of your function (it can be obtained from the Trigger section of your Cloud Function details). Once you fill in the details, click on Test Connection.

- Once the test is finished, click on Save.
- Now it is time to configure our notification policy.
- Back in the Stackdriver dashboard, click on Alerting > Create Policy.

- Click on Add Condition. For this blog post, I will use a very simple uptime alert, as shown in the picture below.

- Once this is done, set a threshold as well.

- Once this is done, click on Save.
- Back in the previous screen, click on Add Notification Channel and select WebHook with Token.

- Select the WebHook we defined

- Finally, give your policy a name and click Save
You are good to go! Once your alert is raised (you can test this by stopping your Redis instance), you will receive an email notification which looks like this.
What’s Next
You can fine-tune and play around with this configuration.
We are working on similar blog posts covering integration with other hyper-scalers. Also, this integration will evolve over time, and the format and the way we display notifications will become much more appealing.
We are working on many cool features, like fallback channels for alerts, alerts for CPI integration flows, lifecycle management alerts for applications, templating of alerts towards different systems like JIRA, and many more. Also, we are working on enhancing our catalogue of alerts by adding more and more services.