
Azure Blob Storage is a cloud-based object storage service provided by Microsoft Azure. It allows users to store and retrieve large amounts of unstructured data, such as text, images, audio, video and binary files.
The objective of this article is to demonstrate how to upload data records into Microsoft Azure Blob Storage and subsequently access and use this data within SAP PaPM Cloud.
In order to access the Microsoft Azure Portal, you'll need an account.
As part of the procedure, you will be required to navigate to three different applications.
Blob Storage offers three types of resources:
- The storage account
- A container in the storage account
- A blob in a container
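This hierarchy is reflected directly in a blob's URL. For illustration (the account, container, and blob names are placeholders):
https://<storage_account>.blob.core.windows.net/<container>/<blob>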
Navigate to the "Shared Access Tokens" section, define the permissions by activating all seven options, and then select the desired expiration timeframe.
After specifying the permissions and the expiry time, generate the SAS token and URL, and copy the Blob SAS token. The copied SAS token will be used in Step 5.
SAS tokens can be used to control access to containers, blobs and tables in Azure Storage. The permissions specified in a SAS token can include read, add, write, delete, list and more, and they can be limited to a specific time window.
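To make this concrete, a SAS token is a set of query parameters appended to the resource URL. The values below are illustrative placeholders, not a working token:
sp=racwdl&st=2024-01-01T00:00:00Z&se=2024-12-31T23:59:59Z&sv=2022-11-02&sr=c&sig=<signature>
Here sp encodes the granted permissions (read, add, create, write, delete, list), st and se bound the validity window, sr=c scopes the token to a container, and sig is the signature computed by Azure.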
Open the SQL console in SAP HANA database explorer and execute the following commands to create a PSE and a certificate.
SELECT * FROM PSES; -- list the existing PSEs
CREATE PSE AZURE_BLOB; -- create the PSE AZURE_BLOB
SELECT SUBJECT_COMMON_NAME, CERTIFICATE_ID, COMMENT, CERTIFICATE FROM CERTIFICATES; -- list the existing certificates
CREATE CERTIFICATE FROM '-----BEGIN CERTIFICATE-----
MIIDdzCCAl+gAwIBAgIEAgAAuTANBgkqhkiG9w0BAQUFADBaMQswCQYDVQQGEwJJ
RTESMBAGA1UEChMJQmFsdGltb3JlMRMwEQYDVQQLEwpDeWJlclRydXN0MSIwIAYD
VQQDExlCYWx0aW1vcmUgQ3liZXJUcnVzdCBSb290MB4XDTAwMDUxMjE4NDYwMFoX
DTI1MDUxMjIzNTkwMFowWjELMAkGA1UEBhMCSUUxEjAQBgNVBAoTCUJhbHRpbW9y
ZTETMBEGA1UECxMKQ3liZXJUcnVzdDEiMCAGA1UEAxMZQmFsdGltb3JlIEN5YmVy
VHJ1c3QgUm9vdDCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBAKMEuyKr
mD1X6CZymrV51Cni4eiVgLGw41uOKymaZN+hXe2wCQVt2yguzmKiYv60iNoS6zjr
IZ3AQSsBUnuId9Mcj8e6uYi1agnnc+gRQKfRzMpijS3ljwumUNKoUMMo6vWrJYeK
mpYcqWe4PwzV9/lSEy/CG9VwcPCPwBLKBsua4dnKM3p31vjsufFoREJIE9LAwqSu
XmD+tqYF/LTdB1kC1FkYmGP1pWPgkAx9XbIGevOF6uvUA65ehD5f/xXtabz5OTZy
dc93Uk3zyZAsuT3lySNTPx8kmCFcB5kpvcY67Oduhjprl3RjM71oGDHweI12v/ye
jl0qhqdNkNwnGjkCAwEAAaNFMEMwHQYDVR0OBBYEFOWdWTCCR1jMrPoIVDaGezq1
BE3wMBIGA1UdEwEB/wQIMAYBAf8CAQMwDgYDVR0PAQH/BAQDAgEGMA0GCSqGSIb3
DQEBBQUAA4IBAQCFDF2O5G9RaEIFoN27TyclhAO992T9Ldcw46QQF+vaKSm2eT92
9hkTI7gQCvlYpNRhcL0EYWoSihfVCr3FvDB81ukMJY2GQE/szKN+OMY3EU/t3Wgx
jkzSswF07r51XgdIGn9w/xZchMB5hbgF/X++ZRGjD8ACtPhSNzkE1akxehi/oCr0
Epn3o0WC4zxe9Z2etciefC7IpJ5OCBRLbf1wbWsaY71k5h+3zvDyny67G7fyUIhz
ksLi4xaNmjICq44Y3ekQEe5+NauQrz4wlHrQMz2nZQ/1/I6eYs9HRCwBXbsdtTLS
R9I4LtD+gdwyah617jzV/OeBHRnDJELqYzmp
-----END CERTIFICATE-----'
COMMENT 'Azure';
After creating the PSE and the certificate, the next step is to add the certificate to the PSE.
Run the following code to obtain the certificate ID.
SELECT CERTIFICATE_ID FROM CERTIFICATES WHERE COMMENT = 'Azure'; -- checking the certificate ID
ALTER PSE AZURE_BLOB ADD CERTIFICATE <CERTIFICATE_ID>; -- adding the certificate to PSE
Set AZURE_BLOB as the PSE for remote sources:
SET PSE AZURE_BLOB PURPOSE REMOTE SOURCE; -- set AZURE_BLOB as the PSE for remote sources
SELECT * FROM PSE_CERTIFICATES; -- verify that the certificate is attached to the PSE
For more information, see Certificate Management.
Run the provided SQL code to save the storage account and shared access signature (SAS) as a credential in the database.
CREATE CREDENTIAL FOR COMPONENT 'SAPHANAIMPORTEXPORT' PURPOSE 'Azure' TYPE 'PASSWORD' USING 'user=<storage_account_name>;password=<Blob_SAS_token>'; -- replace the placeholders with your storage account name and the Blob SAS token copied earlier
SELECT * FROM CREDENTIALS; -- verify that the credential was stored
--DROP CREDENTIAL FOR COMPONENT 'SAPHANAIMPORTEXPORT' PURPOSE 'Azure' TYPE 'PASSWORD'; -- cleanup, if needed
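As an alternative to the graphical import wizard described next, the stored credential can also be used directly in an IMPORT FROM statement. The line below is a minimal sketch: the container, path, file, schema, and table names are placeholders, and the exact azure:// path syntax should be verified against the SAP HANA Cloud IMPORT FROM documentation.
IMPORT FROM CSV FILE 'azure://<container>/<path>/<file>.csv' INTO <SCHEMA>.<TABLE> WITH COLUMN LIST IN FIRST ROW CREDENTIAL 'Azure'; -- placeholders; verify the azure:// path format for your HANA Cloud version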
In SAP HANA Database Explorer, go to the "Tables" section, right-click it, and choose the "Import Data" option.
A new window will open, in which you need to set up the following details:
In step 4, when you press "Compose" within the "Import Source" section, a new window pops up and shows the parsed Azure path.
If you already have a table in the HANA database that you want to use for storing your data from Azure, simply select the "Add to existing table" option, provide the name of the desired table and schema, and skip the next steps.
In "Table Mapping step" you will need to map the source data to the destination table in your HANA database. This involves defining how the columns or fields in the source correspond to those in your database.
After completion, you should receive an "Import successful" message.
The table is now consumed successfully. You can open it and check the data records:
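For example, you can run a quick check from the SQL console (schema and table names are placeholders):
SELECT TOP 10 * FROM "<SCHEMA>"."<TABLE>"; -- preview the first 10 imported records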
The connection is established using the schema and HANA table created in the preceding step.
For more information on how to create a connection, please go to the following link.
For more information on how to create a function, please go to the following link.
More information about Model Table HANA can be found here.
You can verify the successful consumption of data records in the SAP PaPM Cloud modeling perspective by reviewing the Show screen.
In SAP PaPM Cloud, your choice of Calculation and Processing functions can vary according to your specific needs. You have the flexibility to utilize data from Azure Blob Storage by accessing it through a Model Table HANA, Model View HANA Table, or Model View HANA View, serving as your input source.
That concludes our discussion! I understand that integration is a multifaceted subject, but I trust that this concise blog post has shed light on how data from Azure Blob Storage can serve as inputs or data sources for intricate calculations, rules, and simulations within SAP PaPM Cloud.
Wishing you a wonderful day and looking forward to meeting you in our upcoming blog posts.