Technology Blogs by Members
mellinda_pereira
Scenario: We want to show how to upload a CSV file to Google Cloud Storage, create a table from it in BigQuery, and then import that table into SAP Datasphere via Import Remote Tables.

 

1) In GCP Cloud Storage, we need to create a bucket.


 

Give it a name

Next, add a label (this is optional).


Choose where you want this bucket to be stored; I kept the default selection.


 

Choose a storage class for your data (Standard, Nearline, Coldline, Archive); I kept Standard.


Choose how to control access to this bucket (for example, whether or not you want it to be public).


I unchecked the public access prevention option, since I just wanted to test whether anyone can access this bucket.

Next is the security section for this bucket.

I kept the defaults. Then click Create.


When you click on Buckets you see the bucket has been created.
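For reference, the same bucket setup can be scripted with the gcloud CLI; the bucket name and location below are placeholder values, not the ones from the screenshots:

```shell
# Create a bucket; the name and location are example values.
# Public access prevention and other security settings can also be
# configured here, but the console defaults are usually fine for a test.
gcloud storage buckets create gs://my-datasphere-demo-bucket \
    --location=us-central1 \
    --default-storage-class=STANDARD \
    --uniform-bucket-level-access
```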

Next, we will upload a .csv file to this bucket. (We can load a file in any format, but then we would need a script to convert the incoming file into CSV; I will cover that in a later part.)


I uploaded a simple .csv file
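The upload can also be done from the command line; the file and bucket names below are example values:

```shell
# Copy the local CSV file into the bucket.
gcloud storage cp sales.csv gs://my-datasphere-demo-bucket/

# List the bucket contents to verify the upload.
gcloud storage ls gs://my-datasphere-demo-bucket/
```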


Now we will access this CSV file from Cloud Storage in BigQuery.

In BigQuery, under Explorer, I will create a dataset (this is like a schema in which you store all your tables). Click on the project.


 

Next, click Create dataset and give it a proper business name.




The dataset (schema) is created. Now we will create a table from the CSV file in Cloud Storage.

Select Create table with Google Cloud Storage as the source, then select the bucket and the file.


Give the table a name.


 

The table is created and we can preview the data.
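The dataset and table creation can also be sketched with the bq command-line tool; dataset, table, and bucket names are example values, and schema auto-detection is one option among several:

```shell
# Create the dataset (the "schema" that holds the tables).
bq mk --dataset my_sales_dataset

# Create a table from the CSV sitting in Cloud Storage, letting
# BigQuery auto-detect the schema and skipping the header row.
bq load \
    --source_format=CSV \
    --autodetect \
    --skip_leading_rows=1 \
    my_sales_dataset.sales \
    gs://my-datasphere-demo-bucket/sales.csv

# Preview the first rows, like the console's Preview tab.
bq head -n 5 my_sales_dataset.sales
```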


 


 

Now we are going to create a connection in SAP Datasphere for Big Query.

In SAP Datasphere, under Connections, click Create and select Google BigQuery.


 


We need to provide the project ID and a key (an access key, which we need to generate via the API).

You will find the project ID when you click on My First Project.
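If you prefer the command line, the project ID can also be looked up with gcloud:

```shell
# List all projects visible to your account, with their IDs.
gcloud projects list

# Or print the project ID of the currently active configuration.
gcloud config get-value project
```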

Next, for the key, we will go to APIs & Services from the navigation menu.



Next, click Enabled APIs & services, scroll down, and click on BigQuery Connection API.


 

Next, click on Credentials.


Now you need to have a service account.


Let’s create a Service Account

Go to IAM & Admin in the Google Cloud console from the navigation menu and click on Service Accounts.


At the top, click Create Service Account.


Give a relevant service account name


Click Create, and you get a pop-up confirming that the service account has been created.


Now we will see this service account in the BigQuery Connection API credentials.


Click on it and go to Keys


Now click Add key (JSON is the recommended format).


Click Create, and a key file will be downloaded. Store it securely, since we will use it in the connection.
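The service account and key steps above can also be sketched with gcloud; the account name, project ID, and roles below are example values (the exact roles your Datasphere connection needs may differ):

```shell
# Create the service account.
gcloud iam service-accounts create datasphere-bq \
    --display-name="SAP Datasphere BigQuery access"

# Grant it read access to BigQuery data and permission to run jobs.
gcloud projects add-iam-policy-binding my-first-project \
    --member="serviceAccount:datasphere-bq@my-first-project.iam.gserviceaccount.com" \
    --role="roles/bigquery.dataViewer"
gcloud projects add-iam-policy-binding my-first-project \
    --member="serviceAccount:datasphere-bq@my-first-project.iam.gserviceaccount.com" \
    --role="roles/bigquery.jobUser"

# Download a JSON key file for the connection; store it securely.
gcloud iam service-accounts keys create key.json \
    --iam-account=datasphere-bq@my-first-project.iam.gserviceaccount.com
```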


Now, under Create Connection in SAP Datasphere (formerly DWC), enter the project ID and select the key file.


Click on Next Step and give a technical name.


Validate the connection


Now go to the Data Builder in Datasphere and click on Import Remote Tables.


 

Select the connection and then browse your project and select the table

 


 

Click Import and Deploy, and there it is: the table is imported.


 

And finally, the table has been imported successfully into Datasphere. From here you can create a Graphical View, an Entity-Relationship Model, an Analytical Model, and so on.

Hope this article was helpful.