Introduction
Curious about how different factors influence your commission earnings? We're exploring exactly that! Armed with a dataset detailing past commissions and various influencing factors, we're tapping into linear regression—a straightforward yet powerful machine learning technique—to estimate commissions based on different input values. We'll break down the process, making it easy to understand how we transform these inputs into estimated commissions.
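Before reaching for a library, it helps to see what linear regression actually does. For a single feature, ordinary least squares has a closed-form solution: the slope is the covariance of x and y divided by the variance of x. Here is a minimal pure-Python sketch on toy numbers (not the commission dataset):

```python
# Simple linear regression via the closed-form least-squares solution:
# slope = cov(x, y) / var(x), intercept = mean(y) - slope * mean(x)
def fit_line(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov_xy = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    slope = cov_xy / var_x
    intercept = mean_y - slope * mean_x
    return intercept, slope

# Toy example: commission = 0.05 * policy_value, so the fit recovers the line exactly
policy_values = [1000, 2000, 3000, 4000]
commissions = [50.0, 100.0, 150.0, 200.0]
a, b = fit_line(policy_values, commissions)
print(a, b)  # intercept ~0.0, slope ~0.05
```

With multiple features (including one-hot-encoded categorical ones), scikit-learn solves the same least-squares problem in matrix form, which is why we lean on the library below.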
Model Training
This code trains a machine learning model to predict commissions using linear regression. It starts by loading commission data, then preprocesses it by encoding categorical variables and scaling numerical features. The data is split into training and testing sets. A pipeline integrates preprocessing steps with linear regression for training. After training, the model's performance is evaluated with R-squared scores. Finally, the trained model is saved using `joblib`.
```python
# Step 1: Import the required libraries
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import StandardScaler, OneHotEncoder
from sklearn.compose import ColumnTransformer
from sklearn.pipeline import Pipeline
import joblib

# Step 2: Load the dataset
data = pd.read_csv('commission_data.csv')

# Step 3: Preprocess the data
# Separate features and target variable
X = data.drop(columns=['commission'])
y = data['commission']

# Specify categorical columns
categorical_features = ['policy_type', 'region', 'add_ons']

# Create a ColumnTransformer to one-hot encode the categorical columns
# and pass any numerical features through untouched
preprocessor = ColumnTransformer(
    transformers=[
        # handle_unknown='ignore' prevents errors at prediction time
        # if an unseen category value shows up
        ('cat', OneHotEncoder(handle_unknown='ignore'), categorical_features)
    ],
    remainder='passthrough'  # Leave numerical features untouched
)

# Step 4: Split the data into training and testing sets
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Step 5: Define and train the model within a Pipeline
model = Pipeline([
    ('preprocessor', preprocessor),
    ('scaler', StandardScaler(with_mean=False)),  # with_mean=False avoids issues with sparse matrices
    ('regression', LinearRegression())
])
model.fit(X_train, y_train)

# Step 6: Evaluate the model
train_score = model.score(X_train, y_train)
test_score = model.score(X_test, y_test)
print("Training R-squared score:", train_score)
print("Testing R-squared score:", test_score)

# Step 7: Save the trained model
joblib.dump(model, 'trained_model.pkl')
```
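To see what the `OneHotEncoder` step does conceptually, here is a hand-rolled sketch in plain Python. The real encoder additionally handles sparse output, fitted column ordering, and unknown categories, but the core idea is just mapping each category to a 0/1 indicator vector:

```python
def one_hot(values):
    """Map each value to a 0/1 vector over the sorted set of categories."""
    categories = sorted(set(values))
    return [[1 if v == c else 0 for c in categories] for v in values], categories

encoded, cats = one_hot(["Standard", "Premium", "Standard"])
print(cats)     # ['Premium', 'Standard']
print(encoded)  # [[0, 1], [1, 0], [0, 1]]
```

Linear regression can then learn a separate coefficient for each indicator column, which is how categorical inputs like `policy_type` or `region` influence the estimated commission.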
Flask App for Predictions
This Flask app securely predicts commissions using the trained model. It authenticates requests via XSUAA before predicting outcomes based on input data. The model, saved with `joblib`, is loaded to handle predictions. Requests require valid JWT tokens, ensuring only authorized users can access the prediction endpoint. This is the app that we will deploy to Cloud Foundry.
```python
from flask import Flask, request, jsonify, abort
from functools import wraps
import joblib
import pandas as pd
import os
from sap import xssec
from cfenv import AppEnv

app = Flask(__name__)
cf_port = os.getenv("PORT")

# Initialize environment to get XSUAA service configuration
app_env = AppEnv()
xsuaa_service = app_env.get_service(label='xsuaa')

# Authentication decorator
def require_auth(f):
    @wraps(f)  # Preserve the wrapped function's name for Flask's routing
    def decorated_function(*args, **kwargs):
        auth_header = request.headers.get('Authorization')
        if not auth_header:
            abort(401)  # Unauthorized if no header
        try:
            token = auth_header.replace('Bearer ', '', 1)
            security_context = xssec.create_security_context(token, xsuaa_service.credentials)
            # Perform additional checks, like security_context.check_scope('<your-scope>')
        except Exception:
            abort(401)  # Unauthorized if token is invalid
        return f(*args, **kwargs)
    return decorated_function

# Load the trained model
model = joblib.load('trained_model.pkl')

@app.route('/predict', methods=['POST'])
@require_auth  # Protect the /predict endpoint
def predict():
    data = request.get_json(force=True)
    df = pd.DataFrame(data, index=[0])
    prediction = model.predict(df)
    return jsonify(prediction.tolist())

if __name__ == '__main__':
    if cf_port is None:
        # Local development only
        app.run(host='0.0.0.0', port=5000, debug=True)
    else:
        # Never enable debug mode on Cloud Foundry
        app.run(host='0.0.0.0', port=int(cf_port))
```
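The decorator's token handling boils down to stripping the `Bearer ` prefix from the `Authorization` header before handing the raw JWT to `xssec` for validation. A standalone sketch of just that parsing step (the cryptographic validation itself is done by `xssec` against the XSUAA credentials, not here):

```python
def extract_bearer_token(auth_header):
    """Return the raw JWT from an 'Authorization: Bearer <token>' header, or None."""
    if not auth_header or not auth_header.startswith('Bearer '):
        return None
    return auth_header.replace('Bearer ', '', 1)

print(extract_bearer_token('Bearer eyJhbGciOi...'))  # eyJhbGciOi...
print(extract_bearer_token(None))                    # None
```

Rejecting anything that is not a well-formed `Bearer` header up front keeps the decorator's error path simple: every failure collapses to a 401.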
Deploying Flask App to Cloud Foundry
Before you push your application, you need to create an instance of the XSUAA service. This can be done via the Cloud Foundry CLI or the SAP BTP Cockpit. Here’s how you do it with the CLI:
```shell
cf create-service xsuaa application my-xsuaa-service -c xs-security.json
```
Make sure you are logged in to Cloud Foundry first.
To deploy the Flask app to Cloud Foundry, ensure you have `app.py` (Flask application), `trained_model.pkl` (your machine learning model), `requirements.txt` (dependencies list), `Procfile` (startup commands), `runtime.txt` (Python version), and `xs-security.json` (XSUAA security configuration). The `xs-security.json` file is crucial for setting up secure authentication via XSUAA, detailing your app's security domain in terms of scopes and roles. Deploy these files with `cf push`, configuring your app for secure operation with Cloud Foundry's environment and the XSUAA service. The `xs-security.json` file should look something like below.
```json
{
  "xsappname": "my_estimator_app",
  "tenant-mode": "dedicated",
  "scopes": [
    {
      "name": "$XSAPPNAME.user",
      "description": "basic user scope"
    }
  ],
  "role-templates": [
    {
      "name": "UserRole",
      "description": "User role",
      "scope-references": [
        "$XSAPPNAME.user"
      ]
    }
  ]
}
```
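At deployment time the platform expands the `$XSAPPNAME` placeholder to the configured `xsappname` (possibly with an instance suffix appended, depending on the service plan). Conceptually, the effective scope name is derived like this small illustrative sketch:

```python
import json

# Illustrative fragment of the xs-security.json above
xs_security = json.loads("""
{
  "xsappname": "my_estimator_app",
  "scopes": [{"name": "$XSAPPNAME.user", "description": "basic user scope"}]
}
""")

# Rough model of the placeholder expansion XSUAA performs at deployment
effective = [s["name"].replace("$XSAPPNAME", xs_security["xsappname"])
             for s in xs_security["scopes"]]
print(effective)  # ['my_estimator_app.user']
```

This is why the decorator comment in the Flask app suggests checks like `security_context.check_scope('<your-scope>')`: the scope string checked at runtime is the expanded name, not the `$XSAPPNAME` placeholder.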
Then push the app to Cloud Foundry using the command below:
```shell
cf push
```
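For reference, the auxiliary deployment files mentioned above might look like the following. The contents are illustrative; pin dependency versions and pick a `runtime.txt` version string supported by your Python buildpack:

```
# Procfile – tells Cloud Foundry how to start the app
web: python app.py

# requirements.txt – pin versions to match your environment
Flask
pandas
scikit-learn
joblib
sap-xssec
cfenv

# runtime.txt – exact version string depends on the buildpack
python-3.x
```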
Setting Up Postman for Secure Endpoint Access with XSUAA Authentication
After deploying your Flask app integrated with XSUAA on Cloud Foundry, accessing the protected endpoint requires a JWT token for authentication. Here’s how to set up Postman to obtain the JWT token and use it for accessing your app’s endpoint:
Step 1: Obtain JWT Token
1. Create a New Request: In Postman, start by creating a new request to get the JWT token from XSUAA.
2. Configure the Request for Token:
- Method: Select `POST`.
- URL: Use the token endpoint URL from your XSUAA service binding (`https://<your-xsuaa-service-url>/oauth/token`).
- Authorization Type: Choose "Basic Auth". Enter the `clientid` as the username and `clientsecret` as the password, which you can find in your service binding details.
- Headers: Set `Content-Type` to `application/x-www-form-urlencoded`.
- Body: Choose `x-www-form-urlencoded` and add a key `grant_type` with value `client_credentials`.
3. Send the Request: Click “Send”. You should receive a response containing the `access_token` (JWT token).
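If you prefer scripting over Postman, the same token request can be assembled with Python's standard library. The URL and credentials below are placeholders to be replaced with the values from your XSUAA service binding; the sketch only builds the request object rather than sending it:

```python
import base64
import urllib.parse
import urllib.request

def build_token_request(token_url, client_id, client_secret):
    """Build the client-credentials token request (construction only, not sent)."""
    body = urllib.parse.urlencode({'grant_type': 'client_credentials'}).encode()
    basic = base64.b64encode(f'{client_id}:{client_secret}'.encode()).decode()
    return urllib.request.Request(
        token_url,
        data=body,
        headers={
            'Authorization': f'Basic {basic}',
            'Content-Type': 'application/x-www-form-urlencoded',
        },
        method='POST',
    )

# Placeholder values – take the real ones from your service binding
req = build_token_request('https://<your-xsuaa-service-url>/oauth/token',
                          'my-client-id', 'my-client-secret')
print(req.get_method())  # POST
print(req.data)          # b'grant_type=client_credentials'
```

Sending it with `urllib.request.urlopen(req)` returns the JSON body containing the `access_token`, mirroring what Postman shows in its response pane.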
Step 2: Access Your Protected Endpoint
1. Create Another Request: This time, for accessing the protected endpoint of your Flask app.
2. Set Request Details:
- Method: According to your endpoint requirement (e.g., `POST`).
- URL: The URL of your deployed Flask app’s endpoint.
3. Authorization:
- Switch to the "Authorization" tab.
- Type: Select “Bearer Token”.
- Token: Paste the `access_token` you obtained earlier.
4. Configure Body (if your endpoint expects data):
- Choose “raw” and select JSON (application/json), then input your JSON payload.
A sample payload:
```json
{
  "policy_value": 18111,
  "policy_type": "Standard",
  "region": "West",
  "add_ons": "Yes"
}
```
By setting up Postman as described, you're ready to securely interact with your Flask app’s endpoints. The JWT token authenticates your requests, ensuring that only authorized users can access the functionality protected by XSUAA.
Here's a video clip of the commission value estimated by the app, based on the provided input values:
https://www.youtube.com/watch?v=Tn4mfylM8LU
I have attached the mock commission dataset that was used for this blog for your reference.
Besides Postman, consider using SAP Build Apps or SAP Fiori for a user-friendly front-end, enhancing how users interact with the model. It’s a great way for readers to dive deeper into creating more engaging applications.
Conclusion
Remarkably, through the power of linear regression, the model successfully predicted the commission values without explicitly knowing the underlying formula that calculates commissions. This highlights the incredible capability of machine learning to uncover insights and make accurate predictions from data.