
How to connect to the GCP Storage Bucket provided by SAP with the handover files to transfer a backup to SAP RISE

marcusbraga


The purpose of this blog is to demonstrate how to connect to the GCP Storage Bucket using the credentials that SAP provides via SharePoint in the handover file "<sid><prefix-customer>_BROWNFIELD_CREDENTIALS.docx".

In this demonstration we will use SUSE Linux Enterprise Server 15 SP5 as the operating system.

 

Example of a "<sid><prefix-customer>_BROWNFIELD_CREDENTIALS.docx" file:

[Image: marcusbraga_0-1742913667481.png]

 

Creating the .JSON file

Start by creating a .json file with the information contained in the "<sid><prefix-customer>_BROWNFIELD_CREDENTIALS.docx" file.

Copy the contents of the "JSON File Details" section contained within the { } characters.

Example:

[Image: marcusbraga_1-1742914711062.png]

Paste it into a text editor and save the file with the .json extension.

Example:

[Image: marcusbraga_2-1742915154747.png]
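For reference, in case the screenshots are not visible: a Google Cloud service-account key file follows a standard layout. The sketch below uses placeholder values only; the real values must come from the "JSON File Details" section of your credentials document.

```json
{
  "type": "service_account",
  "project_id": "your-project-id",
  "private_key_id": "0123456789abcdef",
  "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
  "client_email": "transfer-sa@your-project-id.iam.gserviceaccount.com",
  "client_id": "123456789012345678901",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token",
  "auth_provider_x509_cert_url": "https://www.googleapis.com/oauth2/v1/certs",
  "client_x509_cert_url": "https://www.googleapis.com/robot/v1/metadata/x509/..."
}
```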

 

Create a directory on the operating system from which you will transfer the backup, and save the generated .json file there. You can use the same directory for the next steps, downloading and installing the gsutil tool that will use this .json file.
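As a minimal sketch, assuming the /gsutil directory used in the examples later in this post:

```shell
# Create a working directory for the key file and the gsutil installation.
# /gsutil is only an example path; adjust it to your own layout.
mkdir -p /gsutil
# Then move the saved key file into it, for example:
# mv ~/gcp-key1.json /gsutil/
```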

 

Installing gsutil

Use this official documentation: https://cloud.google.com/storage/docs/gsutil_install?hl=pt-br#linux

Switch to the folder you created in the previous step with the command "cd".

Download: 

curl -O https://dl.google.com/dl/cloudsdk/channels/rapid/downloads/google-cloud-cli-linux-x86_64.tar.gz

Extract:

tar -xf google-cloud-cli-linux-x86_64.tar.gz

Install:

./google-cloud-sdk/install.sh

You can accept the default options during installation; in my case it looked like this:

[Image: marcusbraga_4-1742916533702.png]

Log out and start a new session for the changes to take effect.

 

Activating the service account contained in the .json file

Run the command below, replacing the path with that of your .json file; you should get a result similar to the image.

gcloud auth activate-service-account --key-file=/gsutil/gcp-key1.json

[Image: marcusbraga_6-1742922543436.png]

 

Validate access to GCP Storage Bucket

Use the ls command to list the directories in the bucket available to your account.

Examples:

gsutil ls gs://sahecXXXtransferXXX01/

After the first copy, simply adjust the path to list subdirectories:

gsutil ls gs://sahecXXXtransferXXX01/HD4-21032025

 

In my case I could not identify the bucket name; I had to ask the Project Lead, because this information was not included in the brownfield file.

The list of bucket names can be returned with the command:

gcloud storage ls --project=my-project

In the example below, I am listing the remote directory containing a backup copy that I made.

Next, I will show the command used to execute the copy.

[Image: marcusbraga_7-1742922985190.png]

 

Copying your backup to the GCP Storage Bucket

Run the command "gsutil -m cp -r SOURCE gs://TARGET/"

Example:

gsutil -m cp -r SOURCE-FOLDER-OF-YOUR-BACKUP gs://saheXXXtransferXX01/

[Image: marcusbraga_8-1742923951889.png]

After the copy is complete, you can list the files again with the command:

gsutil ls gs://sahecXXXtransferXXX01/HD4-21032025/

The -m parameter is used to copy files in parallel. More details about advanced parameters can be found at this official link: https://cloud.google.com/sdk/gcloud/reference/storage/cp

In the example above, the 237 GB copy took 2 hours; the speed will depend on the upload bandwidth of your internet connection.
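As a back-of-the-envelope check of those numbers, 237 GB in 2 hours works out to roughly 33 MB/s, or about 263 Mbit/s of sustained upload:

```shell
# Rough throughput for 237 GB transferred in 2 hours (7200 s)
awk 'BEGIN {
  gb = 237; seconds = 2 * 3600
  printf "%.1f MB/s\n",   gb * 1000 / seconds
  printf "%.0f Mbit/s\n", gb * 8 * 1000 / seconds
}'
# prints:
# 32.9 MB/s
# 263 Mbit/s
```

You can use the same arithmetic with your own link speed to estimate how long a transfer of your backup will take.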

In the same example, I performed a HANA backup with a 20 GB split. With smaller files, a failure in one very large file does not force you to reprocess everything from the beginning.

 

 

After finishing, you just need to request that the backup be copied from the GCP Bucket to the /Migration directory, which is probably already mounted on the database server of your skeleton system. To do this, open a Service Request using the "Download Data from Cloud to Block Storage" template, providing details of the source and destination, such as SID, hostname, IP, and the source and destination paths.

 

 
