on 2025 Mar 25, 6:13 PM
How to Connect to the GCP Storage Bucket Provided by SAP via Handover Files to Transfer a Backup to SAP RISE
The purpose of this blog is to demonstrate how to connect to the GCP bucket using the credentials that SAP provides via SharePoint in the handover file "<sid><prefix-customer>_BROWNFIELD_CREDENTIALS.docx".
This demonstration uses SUSE Linux Enterprise Server 15 SP5 as the operating system.
Example of a file "<sid><prefix-customer>_BROWNFIELD_CREDENTIALS.docx".
Creating the .JSON file
Start by creating a .json file with the information contained in the "_BROWNFIELD_CREDENTIALS.docx" file.
Copy the contents of the "JSON File Details" section contained within the { } characters.
Example:
Paste it into a text editor and save it with the .json extension.
Example:
Create a directory on the operating system from which you will transfer the backup, and save the generated .json file there. You can use the same directory for the download and installation of the gsutil tool in the next steps, which will use this .json file.
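As a rough sketch, a GCP service-account key file generally follows the structure below. All values here are placeholders for illustration only; use the actual contents from your "_BROWNFIELD_CREDENTIALS.docx" handover file, not these:

```json
{
  "type": "service_account",
  "project_id": "example-project-id",
  "private_key_id": "0123456789abcdef",
  "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
  "client_email": "example-sa@example-project-id.iam.gserviceaccount.com",
  "client_id": "123456789012345678901",
  "auth_uri": "https://accounts.google.com/o/oauth2/auth",
  "token_uri": "https://oauth2.googleapis.com/token"
}
```

If the file is not valid JSON (for example, a missing brace after copying from the .docx), the authentication step later will fail, so it is worth double-checking the braces and quotes.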
Installing gsutil
Use this official documentation: https://cloud.google.com/storage/docs/gsutil_install?hl=pt-br#linux
Switch to the folder you created in the previous step with the command "cd".
Download:
curl -O https://dl.google.com/dl/cloudsdk/channels/rapid/downloads/google-cloud-cli-linux-x86_64.tar.gz
Extract:
tar -xf google-cloud-cli-linux-x86_64.tar.gz
Install:
./google-cloud-sdk/install.sh
You can accept the default options during installation; in my case it looked like this:
Log out and start a new session for the changes to take effect.
Authenticate with the service account contained in the .json file
Use the command below, adjusting the path to your .json file; you will get a result similar to the image.
gcloud auth activate-service-account --key-file=/gsutil/gcp-key1.json
Validate access to GCP Storage Bucket
Use the ls command to list the directories in your bucket.
Examples:
gsutil ls gs://sahecXXXtransferXXX01/
After the first copy, just adjust the path to list subdirectories:
gsutil ls gs://sahecXXXtransferXXX01/HD4-21032025
In my case, I could not identify the bucket name and had to ask the Project Lead, because this information was not included in the brownfield file.
The list of bucket names can be retrieved with the command:
gcloud storage ls --project=my-project
In the example below, I am listing the remote directory with a backup copy that I made.
Next, I will show the command used to perform the copy.
Copying your backup to GCP Bucket Storage
Run the command "gsutil -m cp -r SOURCE gs://TARGET/"
Example:
gsutil -m cp -r SOURCE-FOLDER-OF-YOUR-BACKUP gs://saheXXXtransferXX01/
After the copy is complete, you can list the files again with the command:
gsutil ls gs://sahecXXXtransferXXX01/HD4-21032025/
The -m parameter copies files in parallel; more details about advanced parameters can be found in the official documentation: https://cloud.google.com/sdk/gcloud/reference/storage/cp
In the example above, the 237 GB copy took 2 hours; the speed will depend on the upload bandwidth of your internet connection.
In the same example, I performed a HANA backup split into 20 GB files. If the transfer of a very large file fails, this means you do not have to reprocess everything from the beginning.
After finishing, you just need to request that the backup be copied from the GCP bucket to the /Migration directory, which is probably already mounted on the database server of your skeleton system. To do this, open a Service Request using the "Download Data from Cloud to Block Storage" template, providing details of the source and destination, such as SID, hostname, IP, and the source and destination paths.
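The whole flow above can be reviewed end to end as a small dry-run script. This is a sketch, not an official procedure: the key-file path, bucket name, and source directory are placeholders taken from the examples in this post, and the `run` helper only prints each command so you can verify the sequence before executing it for real.

```shell
#!/usr/bin/env bash
# Dry-run sketch of the transfer flow described in this post.
# Placeholders below must be replaced with the values from your handover file.
KEY_FILE=/gsutil/gcp-key1.json
BUCKET=gs://sahecXXXtransferXXX01
SOURCE_DIR=/backup/HD4-21032025

# Print each step instead of executing it; change 'echo "+ $*"' to '"$@"'
# once you have reviewed the sequence and want to run it for real.
run() { echo "+ $*"; }

run gcloud auth activate-service-account --key-file="$KEY_FILE"   # authenticate
run gsutil ls "$BUCKET/"                                          # validate access
run gsutil -m cp -r "$SOURCE_DIR" "$BUCKET/"                      # parallel upload
run gsutil ls "$BUCKET/$(basename "$SOURCE_DIR")/"                # verify the copy
```

Keeping the commands in one reviewable script makes it easier to hand the procedure to a colleague or rerun it for a second system with different placeholder values.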