michal_majer

Setting Up PostgreSQL for SAP BTP Development: A Practical Guide

Part 1 of a series — next up: implementing OAuth with XSUAA and PostgreSQL


Not every BTP application needs HANA Cloud. PostgreSQL on SAP BTP is a managed database service running on hyperscaler infrastructure. Credentials arrive via VCAP_SERVICES in Cloud Foundry or Kubernetes secrets in Kyma.

When Should You Choose PostgreSQL Over HANA Cloud?

Need analytics or calculation views? → HANA Cloud
Need MTX support? → HANA Cloud

Everything else? → PostgreSQL

That's really it. If you're not using HANA-specific features, you're paying premium prices for a standard relational database. PostgreSQL handles CRUD just as well at a fraction of the cost.


Provisioning PostgreSQL on SAP BTP

Prerequisites: Setting Up Entitlements

Before you can create a PostgreSQL instance, your subaccount needs the "PostgreSQL, hyperscaler option" entitlement assigned under Entitlements. For trial accounts, it's set up by default.

Method 1: Creating via BTP Cockpit (UI)

This approach works well when you're getting started or creating a one-off instance.

Step 1: Navigate to your Cloud Foundry Space

  • BTP Cockpit → Subaccount → Cloud Foundry → Spaces → [Your Space]

Step 2: Open the Service Marketplace

  • Click Services in the left menu → Service Marketplace

Step 3: Find PostgreSQL

  • Search for "PostgreSQL" or look for "PostgreSQL, hyperscaler option"
  • Click on the tile

Step 4: Create a new instance

  • Click the Create button

Step 5: Name your instance psql-dev and click Create


Provisioning typically takes 5-15 minutes.


Method 2: Creating via CF CLI

cf login -a https://api.cf.eu10.hana.ondemand.com
cf target -o my-organization -s my-space

# Create the service instance
cf create-service postgresql-db free psql-dev

# Monitor the creation progress
cf service psql-dev

The cf service command will show status like "create in progress" initially, then "create succeeded" when ready.


Getting Your Database Credentials

Once your instance is created, you need credentials to connect. There are two approaches depending on your use case.

For Applications: Service Binding

If you're deploying an application to Cloud Foundry, the standard approach is binding:

# Bind the service to your app
cf bind-service my-application psql-dev

# Restage to pick up the new binding
cf restage my-application

After binding, your application receives credentials via the VCAP_SERVICES environment variable. Here's what that looks like (simplified):

{
  "postgresql-db": [{
    "credentials": {
      "hostname": "postgres-abc123.database.sap.hana.ondemand.com",
      "port": "5432",
      "dbname": "postgres",
      "username": "user_xyz",
      "password": "generated-secure-password",
      "uri": "postgres://user_xyz:password@hostname:5432/postgres?sslmode=verify-full",
      "sslcert": "-----BEGIN CERTIFICATE-----\n...",
      "sslrootcert": "-----BEGIN CERTIFICATE-----\n..."
    }
  }]
}

Important: Notice the SSL certificates. BTP PostgreSQL requires SSL connections — this isn't optional.
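For illustration, reading these credentials by hand (without a helper library) is just JSON parsing. A minimal sketch, assuming the postgresql-db service label from the sample payload above; the function name is mine, not part of any SDK:

```javascript
// Minimal sketch: extract PostgreSQL credentials from VCAP_SERVICES by hand.
// Assumes a binding under the "postgresql-db" label, as in the sample above.
function getPgCredentials(vcapJson) {
  const vcap = JSON.parse(vcapJson);
  const [instance] = vcap['postgresql-db'] || [];
  if (!instance) throw new Error('No postgresql-db binding found');
  return instance.credentials;
}

// Usage on Cloud Foundry:
// const creds = getPgCredentials(process.env.VCAP_SERVICES);
```

In practice, a library like @sap/xsenv (shown later in this post) does this lookup for you and also handles local default-env.json files.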

For Local Development

BTP PostgreSQL isn't publicly accessible from your laptop. For local development, just run PostgreSQL locally via Docker — it's faster anyway.

Option 1: Local PostgreSQL via Docker

For CAP apps, use the pg.yaml shown in the CAP section below with cds watch --profile pg. For non-CAP apps, the same Docker setup works — just connect to localhost:5432 with user/password postgres.

Option 2: Deploy pgAdmin on BTP

Need to browse your BTP database with a GUI? Deploy pgAdmin directly on Cloud Foundry — it runs inside the network and connects without tunneling.

manifest.yml:

applications:
- name: pgadmin-web
  memory: 1G
  disk_quota: 1G
  health-check-type: process
  docker:
    image: dpage/pgadmin4:latest
  random-route: true
  env:
    PGADMIN_DEFAULT_EMAIL: admin@admin.com
    PGADMIN_DEFAULT_PASSWORD: admin123
Then deploy:

cf push


Open the route URL from the cf push output, log in with the credentials above, and add your server using the hostname/port/username/password from your service key.
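If you haven't created a service key yet, the standard CF CLI commands do it; the key name my-key below is arbitrary:

```shell
# Create a service key for the psql-dev instance
cf create-service-key psql-dev my-key

# Print the credentials (hostname, port, username, password, certificates)
cf service-key psql-dev my-key
```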



Connecting from Different Application Types

Now for the practical part — actually connecting your code to the database. I'll cover the major scenarios.

Scenario 1: CAP Application (Node.js)

The @cap-js/postgres package provides official PostgreSQL support for CAP.

Setup:

cds add postgres

Local development with Docker:

# pg.yaml
services:
  db:
    image: postgres:alpine
    environment: { POSTGRES_PASSWORD: postgres }
    ports: ['5432:5432']

Then start the container and run CAP against it:

docker compose -f pg.yaml up -d
cds watch --profile pg

The @cap-js/postgres package comes with default credentials under the [pg] profile that match the Docker defaults, so no configuration is needed.
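For reference, cds add postgres also wires the datasource into package.json. The shape below is a sketch; the exact layout can differ between CAP versions, so treat it as illustrative rather than authoritative:

```json
{
  "dependencies": {
    "@cap-js/postgres": "^1"
  },
  "cds": {
    "requires": {
      "db": {
        "kind": "postgres"
      }
    }
  }
}
```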

Scenario 2: Plain Node.js Application (Express, Fastify, etc.)

Here's where things get interesting for non-CAP developers. You can use PostgreSQL with any Node.js framework using the standard pg package.

Installation:

npm install pg @sap/xsenv

Connection setup:

// db.js - Database connection module
const { Pool } = require('pg');
const xsenv = require('@sap/xsenv');

// Load environment (works both locally with default-env.json and on CF)
xsenv.loadEnv();

// Get PostgreSQL credentials from VCAP_SERVICES
const pgCredentials = xsenv.getServices({ 
  postgres: { label: 'postgresql-db' } 
}).postgres;

// Create connection pool with SSL configuration
const pool = new Pool({
  host: pgCredentials.hostname,
  port: parseInt(pgCredentials.port, 10),
  database: pgCredentials.dbname,
  user: pgCredentials.username,
  password: pgCredentials.password,
  // BTP PostgreSQL enforces SSL; verify the server against the bundled CA
  ssl: {
    rejectUnauthorized: true,
    ca: pgCredentials.sslrootcert,
  },
});

// Test the connection on startup
pool.query('SELECT NOW()')
  .then(() => console.log('✅ PostgreSQL connected'))
  .catch(err => console.error('❌ PostgreSQL connection failed:', err));

module.exports = pool;
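Cloud Foundry stops apps with SIGTERM (on cf restage, cf stop, or platform moves), so it's worth draining the pool before exit. A small sketch; the shutdown helper is my own, not part of pg:

```javascript
// Hypothetical helper: drain the pg connection pool before the process exits.
async function shutdown(pool) {
  await pool.end(); // Pool#end waits for clients to be released, then closes them
  console.log('PostgreSQL pool closed');
}

// Wire-up in your app entry point:
// process.on('SIGTERM', () => shutdown(pool).then(() => process.exit(0)));
```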

Using it in your routes:

// routes/users.js
const express = require('express');
const pool = require('../db');
const router = express.Router();

router.get('/users', async (req, res) => {
  try {
    const result = await pool.query('SELECT id, name, email FROM users');
    res.json(result.rows);
  } catch (err) {
    console.error('Query error:', err);
    res.status(500).json({ error: 'Database error' });
  }
});

router.post('/users', async (req, res) => {
  const { name, email } = req.body;
  try {
    const result = await pool.query(
      'INSERT INTO users (name, email) VALUES ($1, $2) RETURNING *',
      [name, email]
    );
    res.status(201).json(result.rows[0]);
  } catch (err) {
    console.error('Insert error:', err);
    res.status(500).json({ error: 'Database error' });
  }
});

module.exports = router;
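The routes above assume a users table already exists. One way to ensure that at startup, with a table shape inferred from the queries above (the column types are my assumption):

```javascript
// DDL matching the columns the routes use (id, name, email).
const CREATE_USERS_TABLE = `
  CREATE TABLE IF NOT EXISTS users (
    id    SERIAL PRIMARY KEY,
    name  TEXT NOT NULL,
    email TEXT NOT NULL UNIQUE
  )`;

// Run once in your entry point before accepting traffic:
// await pool.query(CREATE_USERS_TABLE);
```

For anything beyond a demo, a migration tool (e.g. node-pg-migrate or Flyway) is the more maintainable choice.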

What's Next: Building OAuth Authentication with XSUAA

Now that you have PostgreSQL running on BTP, we're ready to build something practical.

In the next article, we'll implement OAuth 2.0 authentication using XSUAA and PostgreSQL — without relying on the CAP framework.

See you in Part 2!

