As enterprises increasingly rely on data-driven decision-making, integrating custom AI models into business processes has moved from innovation to necessity. SAP Business Technology Platform (SAP BTP) empowers developers to build, deploy, and operationalize AI models directly within SAP workflows. This article offers a hands-on guide to integrating custom machine learning models built with Python and TensorFlow into SAP BTP, using SAP AI Core and AI API services.
Whether you're predicting sales, classifying documents, or automating business approvals, this guide will help you go from code to integration—step by step.
Before diving in, ensure you have the following:
- An SAP BTP account with access to SAP AI Core and SAP AI Launchpad
- Python with TensorFlow, pandas, and scikit-learn installed
- Docker and access to a container registry that SAP AI Core can pull from
- Basic familiarity with REST APIs
Let’s assume we are building a binary classification model that predicts customer churn.
import tensorflow as tf
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
import pandas as pd
from sklearn.model_selection import train_test_split
# Load sample data
df = pd.read_csv('customer_data.csv')
X = df.drop('churn', axis=1).values
y = df['churn'].values
# Split data
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)
# Define a simple model
model = Sequential([
    Dense(64, activation='relu', input_shape=(X_train.shape[1],)),
    Dense(1, activation='sigmoid')
])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
# Train the model
model.fit(X_train, y_train, epochs=10, validation_data=(X_test, y_test))
# Save model
model.save('churn_model')
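One detail worth making explicit: at inference time, features must arrive in the same column order the model was trained on (the order of `df.drop('churn', axis=1)`). A minimal sketch of enforcing that contract; the column names below are hypothetical, not from the original dataset:

```python
# Hypothetical feature order; it must match the training DataFrame's
# columns after dropping 'churn'. Adjust to your customer_data.csv schema.
FEATURE_ORDER = ['tenure', 'monthly_charges', 'support_calls', 'contract_length']

def to_model_input(record: dict) -> list:
    """Order a raw feature dict into the flat list the model expects.

    Raises KeyError if an expected feature is missing, which is safer
    than silently feeding misaligned columns to the network.
    """
    return [float(record[name]) for name in FEATURE_ORDER]

row = {'monthly_charges': 0.8, 'tenure': 0.5,
       'support_calls': 0.3, 'contract_length': 0.7}
print(to_model_input(row))  # features re-ordered to match training
```

A helper like this belongs in both the training and serving code paths, so the contract cannot drift between them.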
Next, serve the model through a RESTful API (Flask or FastAPI) and package the app in a Docker image.
app.py (Flask serving example)
from flask import Flask, request, jsonify
import tensorflow as tf
import numpy as np
model = tf.keras.models.load_model('churn_model')
app = Flask(__name__)
@app.route('/predict', methods=['POST'])
def predict():
    data = request.get_json(force=True)
    input_data = np.array(data['input']).reshape(1, -1)
    prediction = model.predict(input_data)
    return jsonify({'churn_probability': float(prediction[0][0])})

if __name__ == '__main__':
    # Bind to 0.0.0.0 so the port published by the container is reachable
    app.run(host='0.0.0.0', port=5000)
Dockerfile
FROM python:3.9-slim
WORKDIR /app
COPY . /app
RUN pip install flask tensorflow
EXPOSE 5000
CMD ["python", "app.py"]
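Before the ServingTemplate below can reference the image, it has to be built and pushed to a registry that SAP AI Core can pull from. A sketch, using the same hypothetical repository name as the template (`yourusername/churn-model`):

```shell
# Build the image (run from the directory containing the Dockerfile)
docker build -t yourusername/churn-model:latest .

# Log in to your registry, then push the image
docker login
docker push yourusername/churn-model:latest
```

If your registry is private, SAP AI Core also needs a docker registry secret configured so it can pull the image at deployment time.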
apiVersion: ai.sap.com/v1alpha1
kind: ServingTemplate
metadata:
  name: churn-predictor
spec:
  image: yourusername/churn-model
  protocol: rest
  ports:
    - port: 5000
  resources:
    limits:
      cpu: "1"
      memory: "1Gi"
Deploy this via the SAP AI Core CLI or Launchpad.
aicorectl apply -f ai-deployment.yaml
After deployment, you can expose your model via the AI API by registering it under the respective AI Use Case.
Once registered, your service can be accessed at:
https://<your-instance>.ai.sap.com/inference/predict
Using curl or Postman, test your inference endpoint:
curl -X POST https://<your-instance>.ai.sap.com/inference/predict \
-H "Authorization: Bearer <access_token>" \
-H "Content-Type: application/json" \
-d '{"input": [0.5, 0.8, 0.3, 0.7]}'
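The same call can be made from Python. A minimal sketch using only the standard library; the endpoint URL and token are placeholders, exactly as in the curl example:

```python
import json
import urllib.request

def build_inference_request(url: str, token: str,
                            features: list) -> urllib.request.Request:
    """Assemble the POST request the inference endpoint expects."""
    payload = json.dumps({'input': features}).encode('utf-8')
    return urllib.request.Request(
        url,
        data=payload,
        headers={
            'Authorization': f'Bearer {token}',
            'Content-Type': 'application/json',
        },
        method='POST',
    )

req = build_inference_request(
    'https://<your-instance>.ai.sap.com/inference/predict',
    '<access_token>',
    [0.5, 0.8, 0.3, 0.7],
)
# urllib.request.urlopen(req) would send it; omitted so this sketch runs offline
print(req.get_method(), req.get_header('Content-type'))
```

In production you would wrap the call with timeout handling and token refresh, but the request shape stays the same.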
Now that your model is live, you can embed it into business workflows, whether that means flagging at-risk customers in S/4HANA, routing documents, or automating approvals.
Here’s how the flow works:
+---------------------+ +------------------+ +------------------+
| SAP Application | --> | SAP AI API | --> | Dockerized Model |
| (e.g., S/4HANA) | | (Inference Layer)| | on AI Core |
+---------------------+ +------------------+ +------------------+
^ |
| v
Business Logic Token-based Auth
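Inside the SAP Application box above, the returned churn probability typically gates business logic. A minimal, hypothetical sketch of such a rule; the thresholds are illustrative, not from this article:

```python
def churn_action(churn_probability: float) -> str:
    """Map the model's churn probability onto a follow-up action.

    The 0.7 / 0.4 cut-offs are illustrative; real thresholds would be
    tuned against business costs and the model's validation metrics.
    """
    if churn_probability >= 0.7:
        return 'trigger-retention-workflow'
    if churn_probability >= 0.4:
        return 'flag-for-review'
    return 'no-action'

print(churn_action(0.82))  # high risk: start the retention workflow
```

Keeping the thresholds in configuration rather than code makes them adjustable without redeploying the model.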
Security: Always use OAuth 2.0 tokens to secure endpoints.
Model Monitoring: Use AI Launchpad for versioning, monitoring, and lifecycle management.
Retraining: Automate retraining pipelines using SAP Data Intelligence or Apache Airflow on BTP.
SAP BTP allows you to operationalize custom AI models within the enterprise ecosystem—no need to wait for off-the-shelf integrations. By combining TensorFlow’s flexibility with SAP AI Core’s infrastructure, you can build robust, scalable, and secure AI-powered workflows.