on 05-21-2020 5:06 PM
Hello,
Yesterday we activated SAP Data Intelligence, trial edition 3.0, on Cloud Appliance Library (AWS). When running a Python producer pipeline we get an error in the Artifact Producer.
Graph failure: ArtifactProducer: InArtifact: ProduceArtifact: http request creation error: parse ://:0/webhdfs/v1/shared/ml/artifacts/executions/f77c4f576b0142a0a7223d6e082dc706/artifactproducer1_0.zip?op=CREATE&overwrite=false: missing protocol scheme
This is our first experience with Data Hub/Data Intelligence and we have problems understanding the errors. It looks to me like it's trying to save the model via WebHDFS, but this connection wasn't created during the installation. However, we want to use the Semantic Data Lake, which is already in place.
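For what it's worth, the malformed URL in the error can be reproduced by filling a URL template with empty connection fields. A minimal sketch, assuming (hypothetically) that the producer assembles the WebHDFS URL roughly like this:

```python
# Hypothetical illustration (not DI's actual code): the Artifact Producer
# builds a WebHDFS URL from the connection's protocol, host, and port.
# If the connection was never configured, those fields are empty, which
# yields exactly the unparseable URL from the error message.
protocol, host, port = "", "", 0  # empty values from the missing connection
path = ("/webhdfs/v1/shared/ml/artifacts/executions/"
        "f77c4f576b0142a0a7223d6e082dc706/artifactproducer1_0.zip")
url = f"{protocol}://{host}:{port}{path}?op=CREATE&overwrite=false"
print(url)  # -> ://:0/webhdfs/v1/... which has no scheme before "://"
```

An empty scheme in front of "://" is what the "missing protocol scheme" parse error refers to, which is consistent with the connection not being set up.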
Can you please tell us how to get this working?
Thanks
Hello Mike, unfortunately this is a bug in how the Trial Edition was configured for you. We're working on fixing it. Fortunately for now you can manually correct this problem in the Connection Manager.
First export the DI_DATA_LAKE connection in the Connection Manager. Be sure to only select that connection.
Then, in the connection.json file, search for "type":"S3" and replace this string with "type":"SDL".
Delete the current DI_DATA_LAKE connection in Connection Manager (the one you just exported) and then import the modified connection.json file. You will have to re-enter the AWS key credentials you used when you deployed the Trial Edition.
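If it helps, the edit to the exported file can be scripted. A minimal sketch of the type swap; the single-object structure shown here is an assumption, and the real export may contain additional fields or wrap connections in a list:

```python
import json

# Hedged sketch: flip the connection type from S3 to SDL before re-import.
# The object below stands in for the exported DI_DATA_LAKE connection.
conn = json.loads('{"id": "DI_DATA_LAKE", "type": "S3"}')
if conn.get("type") == "S3":
    conn["type"] = "SDL"
print(json.dumps(conn))  # {"id": "DI_DATA_LAKE", "type": "SDL"}
```

Parsing the JSON rather than doing a raw string replace keeps the edit robust to whitespace differences in the exported file.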
Hi Dimitri,
Thank you for your answers.
I had the same issue but don't have an option to export the connection (data-lake-connection-issue.jpg, please check attachment).
Is it because it is the trial edition?
Without accessing the data lake we can't do a trial run of even simple pipelines, so I am stuck. Please help.
Kind regards,
Chitra T.
dimitri
Thank you for pointing out the correct tab.
Unfortunately, the JSON file already looks OK with "type":"SDL", so I didn't have to change it.
So the problem remains: we can't produce our model and are stuck on our first simple pipeline.
(1) Please let us know if anyone on the current version of SAP Trial DI has a working DI_DATA_LAKE connection, so we can ask them whether they did anything else to make it work.
(2) Or, if you think anything else could affect the connection, please let us know.
(3) Is there another way to produce an artefact, create an artefact ID, store it somewhere, and re-use it in inference?
I also tried a new S3 storage connection and tried to write the model artefact to S3 after changing the configuration and code in the producer operator, but no luck, as the 'file' type model export is no longer supported according to the error message.
Please help.
regards,
Chitra T
Chitra, please see my response here: https://answers.sap.com/answers/13209197/view.html
Actually, you should be able to upgrade the system manually, but you will need a jump host, either Linux or Win64, and to activate the necessary tools as I have described in the blog: prepare the installation host for the SLC Bridge.
Best regards, Roland
Hello
Since probably not everything was tested in the final version of DI 3.0, you might want to update to version 3.0.2 and test again.
The connection to ADL_V2 didn't work with a lower version so far ...
https://blogs.sap.com/2020/03/05/sap-data-intelligence-3.0-implement-with-slcb-tool/
Best regards, Roland