This blog post describes how to connect SAP Data Intelligence to Google Cloud, install the Google BigQuery ODBC driver, and create a pipeline that writes data into a Google BigQuery table.

1.- Create the connection between SAP Data Intelligence and Google Cloud

  • Log in to the Google Cloud console and navigate to IAM & Admin -> Service Accounts

  • Create a private key for the service account and download it in JSON format to your local machine (a command-line alternative is sketched after the screenshot below)



(Screenshot: Google Cloud Platform Service Accounts)
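
If you prefer the command line, the same key can be created with the gcloud CLI; the service account address and file name below are placeholders, not values from this scenario:

    # Hypothetical names; replace with your own service account and project
    gcloud iam service-accounts keys create key.json \
        --iam-account=my-service-account@my-project.iam.gserviceaccount.com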




  • Open Connection Management from SAP Data Intelligence and create a new connection of type Google Cloud Platform BigQuery. The project ID must match the project ID from GCP; open the JSON key file to view it (the file layout is sketched below)



(Screenshot: Private Key)
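
For reference, the downloaded key file is a JSON document, and the connection's project ID comes from its project_id field; all values below are placeholders:

    {
      "type": "service_account",
      "project_id": "my-project",
      "private_key_id": "...",
      "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
      "client_email": "my-service-account@my-project.iam.gserviceaccount.com"
    }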




  • Provide the key file downloaded in JSON format, test the connection, and click Create



(Screenshot: Create Connection)


2.- Install the ODBC driver


(Screenshot: SAP Data Intelligence Launchpad)




  • Create the vsolution area; here is the code:
    mkdir -p gcp_bigquery_vsolution/content/files/flowagent




(Screenshot: VS Code)




  • Go to System Management on the SAP Data Intelligence Launchpad, click the Files tab, and create the vsolution manifest file manifest.json inside the gcp_bigquery_vsolution/ directory



(Screenshot: Creating Manifest File)


Here is the code:



{
  "name": "vsolution_gcp_bigquery",
  "version": "1.0.0",
  "format": "2",
  "dependencies": []
}


  • Extract the downloaded compressed TAR archive SimbaODBCDriverforGoogleBigQuery_2.5.0.1001-Linux.tar.gz, then extract the inner compressed TAR archive SimbaODBCDriverforGoogleBigQuery64_2.5.0.1001.tar.gz into the gcp_bigquery_vsolution/content/files/flowagent/ directory (shell commands are sketched below)



(Screenshot: Importing TAR File)
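
On a local shell, the double extraction looks roughly like this, assuming the inner archive lands in the SimbaODBCDriverforGoogleBigQuery_2.5.0.1001-Linux/ directory after the first step:

    # Extract the outer archive, then unpack the inner driver archive into the vsolution
    tar -xzf SimbaODBCDriverforGoogleBigQuery_2.5.0.1001-Linux.tar.gz
    tar -xzf SimbaODBCDriverforGoogleBigQuery_2.5.0.1001-Linux/SimbaODBCDriverforGoogleBigQuery64_2.5.0.1001.tar.gz \
        -C gcp_bigquery_vsolution/content/files/flowagent/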




  • Then copy the file GoogleBigQueryODBC.did from the SimbaODBCDriverforGoogleBigQuery_2.5.0.1001-Linux/ directory into the gcp_bigquery_vsolution/content/files/flowagent/SimbaODBCDriverforGoogleBigQuery64_2.5.0.1001/lib/ directory (see the command below)



(Screenshot: Importing DID File)
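
As a shell one-liner, assuming the directory layout from the previous steps:

    cp SimbaODBCDriverforGoogleBigQuery_2.5.0.1001-Linux/GoogleBigQueryODBC.did \
       gcp_bigquery_vsolution/content/files/flowagent/SimbaODBCDriverforGoogleBigQuery64_2.5.0.1001/lib/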




  • (Optional) Configure the driver to display proper error messages; here is the code:
    mv gcp_bigquery_vsolution/content/files/flowagent/SimbaODBCDriverforGoogleBigQuery64_2.5.0.1001/ErrorMessages/en-US \
       gcp_bigquery_vsolution/content/files/flowagent/SimbaODBCDriverforGoogleBigQuery64_2.5.0.1001/lib/en-US




(Screenshot: Configuring Error Messages)




  • The gcp_bigquery_vsolution/content/files/flowagent/SimbaODBCDriverforGoogleBigQuery64_2.5.0.1001/lib directory should now have the following structure (a text sketch follows the screenshots):



(Screenshot: lib/ Directory)



(Screenshot: lib/en-US Directory)
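
Since the screenshots are not reproduced here, the entries that matter for this setup, based on the steps above, are at least the following (the driver ships additional files alongside them):

    lib/
    ├── en-US/                            (moved here in the optional step)
    ├── GoogleBigQueryODBC.did            (copied here earlier)
    └── libgooglebigqueryodbc_sb64.so     (referenced by the properties file below)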




  • Inside the gcp_bigquery_vsolution/content/files/flowagent/ directory, create a properties file gcp_bigquery.properties that points to the driver manager with a path relative to the properties file



(Screenshot: Creating Properties File)


Here is the code:



GOOGLEBIGQUERY_DRIVERMANAGER=./SimbaODBCDriverforGoogleBigQuery64_2.5.0.1001/lib/libgooglebigqueryodbc_sb64.so


  • Compress the vsolution from the gcp_bigquery_vsolution/ directory into a zip file (a sample command follows the screenshots)



(Screenshot: Export File)



(Screenshot: Exporting gcp_bigquery_vsolution Directory)
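
From a shell, one way to build the archive is to zip from inside the directory, so that manifest.json and content/ sit at the root of the zip file (an assumption based on the layout created above):

    cd gcp_bigquery_vsolution
    zip -r ../gcp_bigquery_vsolution.zip .
    cd ..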




  • Start the System Management application from the Launchpad, then click the Tenant tab. Click +, and select the newly created gcp_bigquery_vsolution.zip



(Screenshot: Add Solution to Tenant)



(Screenshot: Importing Solution)




  • After the import is complete, choose the Strategy subtab and click Edit. Select the newly imported solution vsolution_gcp_bigquery-1.0.0, click Add, and then click Save



(Screenshot: Edit Strategy)

(Screenshot: Add Solution)

(Screenshot: Save Changes)




  • For the changes to take effect, restart the flowagent application



(Screenshot: Restart Flowagent Application)

(Screenshot: Flowagent Application Restarted)


3.- Create a pipeline to write into a Google Cloud BigQuery table

  • Go to the Modeler on the SAP Data Intelligence Launchpad and create a pipeline that contains a Google BigQuery Table Producer operator



(Screenshot: Google BigQuery Pipeline)




  • In the operator configuration, select the connection created in step 1, together with the target dataset and table in Google BigQuery



(Screenshot: Google BigQuery Table Producer Configuration)




  • In this example we have a CSV source file; before we run the pipeline, the target table is empty (a hypothetical sample file is sketched below)



(Screenshot: Google BigQuery Empty Table)
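
For illustration only, a minimal CSV source could look like the following; the column names are hypothetical and must match the schema of the target BigQuery table:

    id,name,amount
    1,alpha,10.5
    2,beta,22.0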





  • Then we run the pipeline to write into the table





(Screenshot: Google BigQuery Pipeline Execution)




  • When the execution is complete, check the table in Google BigQuery; the rows from the CSV file should now appear



(Screenshot: Google BigQuery Table)


 

In this blog post you learned how to connect SAP Data Intelligence with Google Cloud Platform, install the ODBC driver for Google BigQuery, and write data into a BigQuery table.

I hope you found this blog post helpful. For any questions or feedback, just leave a comment below. Thanks for reading. Stay tuned 😊

You can follow the SAP Data Intelligence tag to receive updates on new blog posts.

Please follow my profile gvglupita for future posts. Have fun learning SAP Data Intelligence.