
Artifact Producer error

Hello,

Yesterday we activated SAP Data Intelligence, trial edition 3.0, on the Cloud Appliance Library (AWS). When running a Python producer pipeline we get an error in the Artifact Producer.

Graph failure: ArtifactProducer: InArtifact: ProduceArtifact: http request creation error: parse ://:0/webhdfs/v1/shared/ml/artifacts/executions/f77c4f576b0142a0a7223d6e082dc706/artifactproducer1_0.zip?op=CREATE&overwrite=false: missing protocol scheme

This is our first experience with Data Hub/Data Intelligence and we have problems understanding the errors. It looks to me like it is trying to save the model via WebHDFS, but this connection wasn't created during the installation. We want to use the Semantic Data Lake, which is already in place.
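For what it's worth, the shape of the mangled URL in the error suggests the connection's scheme, host, and port were all empty when the request URL was assembled. A minimal sketch of that reconstruction (the template variables are assumptions, not the actual Artifact Producer code):

```python
# Hypothetical reconstruction: if the connection's scheme, host, and port
# are unset, a URL template produces exactly the string seen in the error,
# which no URL parser can handle ("missing protocol scheme").
scheme, host, port = "", "", 0  # values an unconfigured connection might yield

path = ("/webhdfs/v1/shared/ml/artifacts/executions/"
        "f77c4f576b0142a0a7223d6e082dc706/artifactproducer1_0.zip")
url = f"{scheme}://{host}:{port}{path}?op=CREATE&overwrite=false"
print(url)  # starts with "://:0/webhdfs/v1/..." just like the error message
```

That points at a misconfigured connection rather than a problem in the pipeline itself.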

Can you please tell us how to get this working?

Thanks

former_member255270
Active Participant

Hello Mike, unfortunately this is a bug in how the Trial Edition was configured. We're working on fixing it. For now you can correct the problem manually in the Connection Manager.

First export the DI_DATA_LAKE connection in the Connection Manager. Be sure to only select that connection.

Then in the connection.json file search for "type":"S3" and replace this string with "type":"SDL"

Delete the current DI_DATA_LAKE connection in Connection Manager (that you just exported) and then import the modified connection.json file. You will have to re-enter your AWS key credentials that you used when you deployed the Trial Edition.
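If hand-editing the file feels error-prone, the search-and-replace step can be scripted. A minimal sketch, assuming only what the steps above state (the file is JSON and the one known change is `"type":"S3"` to `"type":"SDL"`); the function names and file names here are illustrative, and the walk makes no assumption about the export's exact layout:

```python
import json

def convert_s3_to_sdl(node):
    """Recursively rewrite any "type": "S3" entry to "type": "SDL".

    The exported file's exact structure may vary between DI versions,
    so this walks the whole parsed JSON rather than assuming a layout.
    """
    if isinstance(node, dict):
        if node.get("type") == "S3":
            node["type"] = "SDL"
        for value in node.values():
            convert_s3_to_sdl(value)
    elif isinstance(node, list):
        for item in node:
            convert_s3_to_sdl(item)
    return node

def fix_connection_file(path_in="connection.json",
                        path_out="connection_fixed.json"):
    # Load the exported file, patch the type, write a new file to import.
    with open(path_in) as f:
        data = json.load(f)
    convert_s3_to_sdl(data)
    with open(path_out, "w") as f:
        json.dump(data, f, indent=2)
```

You would still delete the old DI_DATA_Lake connection and import the patched file through the Connection Manager UI as described, re-entering the AWS credentials.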


It works now, thank you very much!


Hi dimitri,

Thank you for your answers.

I had the same issue, but I don't have an option to export the connection (data-lake-connection-issue.jpg, please check the attachment).

Is it because it is the trial edition?

Without access to the data lake we can't do a trial run of even simple pipelines, so I am stuck. Please help.

kind regards,

Chitra T.

former_member255270
Active Participant

chitra27 You are in the wrong sub-tab of the Connection Manager. Your screenshot shows the "Connection Types" tab, but you need the "Connections" tab.


dimitri

Thank you for pointing out the correct tab.

Unfortunately, the JSON file already shows "type":"SDL", so I didn't have to change anything.

So the problem remains: we can't produce our model and are stuck on our first simple pipeline.

(1) Please let us know if anyone on the current version of the SAP DI Trial has a working DI_DATA_LAKE connection, so we can ask whether they did anything else to make it work.

(2) Or, if you think anything else could affect the connection, please let us know.

(3) Is there another way to produce an artifact, create an artifact ID, store it somewhere, and reuse it in inference?

I also tried with a new S3 storage connection and attempted to write the model artifact to S3 after changing the configuration and code in the producer operator, but no luck: according to the error message, the 'file' type model export is no longer supported.

Please help.

regards,

Chitra T

former_member255270
Active Participant

Hi Chitra, what is the error you are experiencing? It sounds like this may be an entirely different issue than what was originally described.

Hi dimitri,

Thank you for your continued support.

It is the same issue: making the DI_DATA_LAKE connection in the trial version work with the SDL type. The standard Artifact Producer points to the SDL in DI_DATA_LAKE, so I am trying the same thing. I just mentioned that I also tried with S3 storage, but in vain. FYI, we are hosted on GCP.

Please help.

regards,

Chitra T
former_member255270
Active Participant

Chitra, please see my response here: https://answers.sap.com/answers/13209197/view.html


Thank you very much

Chitra T

former_member709354
Discoverer

Is there a way to create artifact names dynamically?