Technology Blog Posts by Members
marton_horvath2

If you are here, you may already know the issue, but for the rest:

The problem: The current out-of-the-box way of connecting to Salesforce with Data Services won't work after Salesforce retires its REST API v21 in the summer of '25.

Regarding the end of Salesforce compatibility, SAP cites legal restrictions in the related SAP Note:

3004914 - SAP Data Services and Salesforce.com compatibility

Cause:

The Adapter for SFDC as a part of the Data Services using is indeed on SFDC API version 21 only. Due to
the legal restrictions, SAP applications do not connect to SFDC directly or provide direct interface into
SFDC systems. Thus SAP Data Services cannot connect directly to SFDC as well. The supported version
21 is prior to the restrictions and thus it is being provided in an ‘as is’ state and no further enhancements
will be provided.

SAP proposes two possible solutions:

In this post I describe a solution which is free* and stays within BODS. If you have other data sources besides Salesforce and a large investment in BODS, you may prefer to stick with it. With the solution below you can do so without additional cost.

The key building blocks will be:

user defined transform + Python + simple-salesforce + Salesforce Bulk API 2.0 (+ CSV)

User defined transform (UDT)

In a nutshell, the User Defined Transform is the way to extend BODS's out-of-the-box capabilities with whatever you want. Well… as long as it fits into 255-character varchar columns. This is quite a limitation in 2024, so the workaround we will apply is to dump our Salesforce resultset into a local CSV. It is less elegant, but it works with longer fields as well.

Simple-salesforce

Simple-salesforce is a Python library for... you guessed it, Salesforce. You could use the Salesforce API directly, but I found this library both simple and useful. The latest version supports Bulk API 2.0 as well. To be able to use it within BODS, we need to import it into our environment as follows:

Adding simple_salesforce to the local Python:
Add the following directories to the PATH environment variable:

  • E:\SAP BusinessObjects\Data Services\DataQuality\python
  • E:\SAP BusinessObjects\Data Services\DataQuality\python\Scripts
  • E:\SAP BusinessObjects\Data Services\bin

Download and install pip

Documentation: https://pip.pypa.io/en/stable/installation/#get-pip-py

1. Download the script from https://bootstrap.pypa.io/get-pip.py

2. Command prompt: python get-pip.py

Install simple_salesforce:

python -m pip install --target="E:\SAP BusinessObjects\Data Services\DataQuality\python\Lib\site-packages" simple_salesforce
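If you install into a custom --target folder rather than the interpreter's default Lib\site-packages, the Python embedded in BODS may not find the library automatically. A minimal sketch to make sure the folder is on sys.path at runtime (the path below is the one from the install command above; adjust it to your installation):

```python
# Ensure the custom install target is importable from the UDT's Python.
# The path is the one used in the pip command above -- adjust to your install.
import sys

target = r"E:\SAP BusinessObjects\Data Services\DataQuality\python\Lib\site-packages"
if target not in sys.path:
    sys.path.append(target)
print(target in sys.path)  # → True
```

If you installed into the default Lib\site-packages of the same interpreter, this step is typically unnecessary.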

Python code of the User Defined Transform

 

 

from simple_salesforce import Salesforce, SalesforceLogin, SFType

# set Salesforce connection details
username = 'aaa'       # !!!
password = 'bbb'       # !!!
security_token = 'ccc' # !!!
domain = 'xxx'         # !!!
sf_version = '59.0'

# login and get session
session_id, instance = SalesforceLogin(
    username=username, password=password,
    security_token=security_token, sf_version=sf_version, domain=domain
)
sf = Salesforce(instance=instance, session_id=session_id)

# incoming parameters from the dataflow
queryName = record.GetField(u'i_queryName')
querySOQL = record.GetField(u'i_querySOQL')

# execute query via Bulk API 2.0
results = sf.bulk2.Account.query(
    query=querySOQL, max_records=None, column_delimiter='PIPE', line_ending='CRLF'
)

# write results to local drive as CSV -- !!!customize to your local folder!!!
for i, data in enumerate(results):
    with open(f"F:/FileDatasource/results/{queryName}-part-{i}.csv", "w", encoding="utf-8") as bos:
        bos.write(data)

o_result = queryName + ' loaded successfully'
record.SetField(u'o_result', o_result)

del queryName
del querySOQL
del o_result
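To try the part-writing loop without a Salesforce login, here is a standalone sketch of the same logic with an in-memory stand-in for the bulk2 result chunks (folder and names are illustrative):

```python
import os
import tempfile

# Standalone version of the part-writing loop above; `results` is any
# iterable of CSV text chunks, as the bulk2 query yields them.
def write_parts(results, folder, query_name):
    paths = []
    for i, data in enumerate(results):
        path = os.path.join(folder, f"{query_name}-part-{i}.csv")
        with open(path, "w", encoding="utf-8") as bos:
            bos.write(data)
        paths.append(path)
    return paths

# Two fake CSV chunks standing in for the bulk2 result
chunks = ["Id|Name\r\n1|Acme\r\n", "Id|Name\r\n2|Globex\r\n"]
folder = tempfile.mkdtemp()
print([os.path.basename(p) for p in write_parts(chunks, folder, "accounts")])
# → ['accounts-part-0.csv', 'accounts-part-1.csv']
```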

 

 

A few notes regarding the above code:

  • sf.bulk2.Account.query – Here Account doesn't seem to matter at all. I tried sf.bulk2.blalba.query – and it worked the same way. What seems to matter is the SOQL query. Consequently, you can serve different Salesforce table loads with the same single UDT.
  • I used PIPE as the column delimiter and CRLF as the line ending for the CSV. These are not the default settings for CSVs in BODS, so keep that in mind when defining your CSV source. I chose them because these characters appear less frequently in Salesforce content. It did the trick for me, but keep in mind that this is still not a bulletproof solution. You may want to check for these characters in your content before dumping to CSV.
  • Don't forget to customise the UDT to your environment:
    • username
    • password
    • token
    • domain
    • CSV local folder
  • Unfortunately, I experienced BODS crashing with an ntdll.dll error after closing the Python editor when it was opened from Repository\Transforms (i.e. modifying the main object), but less so when modifying an instance of the same UDT object inside a dataflow. The trace I found in the Windows event log points to: C:\Windows\SYSTEM32\ntdll.dll
    A related SAP Note on the topic: 3485456 - Designer crashes frequently without error message - SAP BusinessObjects Data Services
    Ultimately, my recommendation is to develop the Python code outside of BODS and later import the UDT from XML or ATL.
  • Unicode GetField crash: 1256795 - User Defined Transform with Python code is crashing - Data Services
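Since PIPE and CRLF are not bulletproof, here is a small helper of my own (a sketch, not part of simple-salesforce) that flags field values which would break the CSV before you dump them:

```python
# My own pre-check sketch: returns (row_index, field_name) pairs whose
# value contains the chosen delimiter or line ending.
def find_unsafe_values(rows, delimiter='|', line_ending='\r\n'):
    unsafe = []
    for i, row in enumerate(rows):
        for name, value in row.items():
            if isinstance(value, str) and (delimiter in value or line_ending in value):
                unsafe.append((i, name))
    return unsafe

rows = [
    {'Name': 'Acme Ltd', 'Note': 'ok'},
    {'Name': 'Pipe|Corp', 'Note': 'contains the delimiter'},
]
print(find_unsafe_values(rows))  # → [(1, 'Name')]
```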

Attached you can find the export of the User Defined Transform.

Summary

With the above UDT you can easily build dataflows to extract data from Salesforce:

  1. You may start with a Row Generator.
  2. With a Query transform you set the two incoming parameters:
    1. a query name - just to identify your query
    2. the query itself
  3. The next step is the UDT itself: connect the inputs, and the output back to the dataflow. This transform should produce a single column with '[query] loaded successfully' after the results have been written as CSV to the local drive.
  4. With a second dataflow you then process the CSV.
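As a sanity check outside BODS, the dumped parts can be read back in Python with the same non-default settings the UDT wrote with (PIPE delimiter; path and file name below are illustrative). In the real pipeline this parsing happens in the second dataflow via a BODS file format.

```python
import csv
import os
import tempfile

def read_part(path):
    """Read one dumped CSV part using the PIPE delimiter the UDT wrote with."""
    with open(path, newline='', encoding='utf-8') as f:
        return list(csv.DictReader(f, delimiter='|'))

# Demo with a sample part file matching the UDT's output format
path = os.path.join(tempfile.mkdtemp(), "accounts-part-0.csv")
with open(path, "w", newline='', encoding='utf-8') as f:
    f.write("Id|Name\r\n1|Acme\r\n")
print(read_part(path))  # → [{'Id': '1', 'Name': 'Acme'}]
```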

*If you find the above solution useful please leave a comment with your country / city. I would like to create a map with pins - hopefully - around the globe.

 
