Artificial Intelligence and Machine Learning Discussions
Engage in AI and ML discussions. Collaborate on innovative solutions, explore SAP's AI innovations, and discuss use cases, challenges, and future possibilities.

SAP AI Core app & Azure Storage file Reading/Writing challenge

felixteched
Explorer
628

Dear SAP AI Community,

I am building an application on SAP AI Core to train a model and send an email with the output. The application works fine when reading the input file from the Azure Storage location; the problem is when I want to write the training results back, in CSV format, to a specific folder in the Azure Storage location.

The execution runs without any error, but nothing happens at the storage level: the Python call df.to_csv doesn't write the file to storage. I think the issue is how to connect my Docker image / Python code to Azure Storage so that the output is sent there and persisted.

I tried using an output parameter in my YAML file, but even with that in place and result sets configured in SAP AI Core, I was not able to save the file. Any idea would be very much appreciated!

Thanks

3 REPLIES

djreddy65
Explorer
0 Kudos
541

All the best

dominik_jacobs
Participant
0 Kudos
430

Hi,

without more information about your code (methods used, classes, implementation) and about the structure of your Docker image, it is difficult to give you specific suggestions for improvement.

However, here are some general points you can check, since writing to Azure Storage does work in general.

1.) Verify Azure Storage Connection Setup

Ensure that the Docker container running in SAP AI Core can connect to your Azure Storage account. This typically involves setting environment variables and using a proper SDK for Azure Storage. Since you can already read data, I assume the storage connection itself is set up correctly.
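
If you want to rule this out explicitly, a minimal connectivity check could look like the sketch below. It assumes the azure-storage-blob package is installed and that the connection string is exposed via the AZURE_STORAGE_CONNECTION_STRING environment variable (both assumptions; see points 3.) and 5.) below):

import os
from azure.storage.blob import BlobServiceClient

# Raises on bad credentials or an unreachable endpoint.
service = BlobServiceClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"]
)
print(service.get_account_information())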

2.) Use Azure Storage SDK in Your Python Code

Ensure you are using the Azure Storage SDK to write data to Azure Storage. The pandas to_csv method does not support writing to Azure Storage directly, so you will need, for example, the Azure Storage Blob SDK.
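
As a rough sketch (the container and blob names are placeholders, and azure-storage-blob is just one possible SDK), you can serialize the DataFrame in memory and upload it explicitly:

import os
import pandas as pd
from azure.storage.blob import BlobServiceClient

df = pd.DataFrame({"email": ["a@example.com"], "score": [0.9]})

service = BlobServiceClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"]
)
blob = service.get_blob_client(
    container="results", blob="output/train_results.csv"
)

# to_csv without a path returns the CSV as a string;
# upload that string as the blob content.
blob.upload_blob(df.to_csv(index=False), overwrite=True)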

3.) Ensure Environment Variables are Set

Make sure the environment variable AZURE_STORAGE_CONNECTION_STRING is set in your Docker container or passed through your SAP AI Core configuration, for example:

environment:
  - name: AZURE_STORAGE_CONNECTION_STRING
    valueFrom:
      secretKeyRef:
        name: your-secret-name
        key: connection-string

4.) Configure Your YAML File Properly

Ensure that your YAML configuration file in SAP AI Core includes the necessary setup for the Azure Storage connection.
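
For orientation, output artifacts in an (Argo-based) SAP AI Core workflow template are typically declared roughly like this; all names and paths below are placeholders, so please verify the exact schema against the SAP AI Core documentation:

spec:
  templates:
    - name: training
      outputs:
        artifacts:
          - name: sltoutput          # artifact name inside the template
            globalName: sltoutput    # name surfaced to SAP AI Core
            path: /app/emails        # directory your code writes to
            archive:
              none: {}               # upload files as-is, not as a tarball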

5.) Check Permissions

Ensure that the service principal or credentials used in the AZURE_STORAGE_CONNECTION_STRING have the appropriate permissions to write to the specified container in Azure Storage.

And if none of this helps, build detailed logging into your Python program to monitor the connection to Azure Storage and the upload process.
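
A small sketch of what such logging could look like around the upload step (the function and all names are illustrative):

import logging
import os

from azure.storage.blob import BlobServiceClient

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("azure-upload")

def upload_csv(csv_text: str, container: str, blob_name: str) -> None:
    service = BlobServiceClient.from_connection_string(
        os.environ["AZURE_STORAGE_CONNECTION_STRING"]
    )
    blob = service.get_blob_client(container=container, blob=blob_name)
    try:
        log.info("Uploading %d characters to %s/%s",
                 len(csv_text), container, blob_name)
        blob.upload_blob(csv_text, overwrite=True)
        log.info("Upload succeeded")
    except Exception:
        # log.exception records the full traceback before re-raising
        log.exception("Upload to Azure Storage failed")
        raise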

Greetings

Dom

felixteched
Explorer
405

Hello, thanks for the reply. After checking the documentation, it looks like we cannot use the to_csv function directly (as you mentioned); nevertheless, we will investigate the Azure Storage Blob SDK.

Using the output parameter I was able to create a model artifact containing the CSV file with the output. The issue is that the output file is written to a dynamic path under the default path ( /app/model ) instead of the one specified in the output parameter path ( /app/emails ).

The path where it is stored is named ai://default/e2ea567b43425380/sltoutput-email, using an execution ID, so I would need to look that up to read the file and move it to another location. Anyway, we will investigate the Blob SDK option.

Thanks for the help!