on 2023 Dec 11 3:18 PM
Hi folks,
I am working on the SAP House Price Prediction use case. After deployment, I tried to POST to it in Postman using the generated URL "https://api.ai.prod.us-east-1.aws.ml.hana.ondemand.com/v2/inference/deployments/d1994d2a0c950f4a/v2/...", but the request returns an error: `FileNotFoundError: [Errno 2] No such file or directory: '/mnt/models/model.pkl'`.
SAP AI Core Deployment:
AWS S3 bucket (model.pkl):
I suspect I encountered the error because of this STORAGE_URI in the executable.yaml, which is present in GitHub:

```yaml
set -e && echo "Starting" && gunicorn --chdir /app/src main:app -b 0.0.0.0:9001 # filename `main` flask variable `app`
env:
  - name: STORAGE_URI # Required
    value: "{{inputs.artifacts.housepricemodel}}" # Required reference from artifact name, see above
  - name: greetingmessage # different name to avoid confusion
    value: "{{inputs.parameters.greetmessage}}"
```
GitHub (Server-Executable.yaml):
What exactly is the cause of this error, and can anyone please point me in the right direction to solve it?
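For reference, the serving code behind `main:app` loads the pickle at startup, which is where the FileNotFoundError surfaces when nothing was synced into /mnt/models. A minimal sketch (this is not the exact tutorial code; the `MODEL_PATH` default and the error message are my own):

```python
import os
import pickle

# In SAP AI Core serving templates, STORAGE_URI tells the platform which
# artifact to download; the files are placed under /mnt/models in the container.
MODEL_PATH = os.environ.get("MODEL_PATH", "/mnt/models/model.pkl")

def load_model(path=MODEL_PATH):
    """Load the pickled model, failing with a clear message if the sync
    from the object store never happened (the error seen in this thread)."""
    if not os.path.exists(path):
        raise FileNotFoundError(
            f"{path} not found - check that STORAGE_URI references the "
            "artifact and that the artifact URL matches the real object path"
        )
    with open(path, "rb") as f:
        return pickle.load(f)
```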
Hi all,
I'm having exactly the same issue when trying to generate the predictions.
Was there a solution to this issue in the end?
Thanks a lot!
Could you share your model artifact and the configuration for this deployment?
Generally this is caused by a mismatch between your input artifact's URL and its real path in your object store.
BR,
Eric
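One quick way to spot such a mismatch is to split the registered artifact URL into bucket and key and compare them against the actual object listing. A small helper, assuming the artifact URL uses the `s3://` scheme (the function name is my own):

```python
from urllib.parse import urlparse

def split_s3_uri(uri):
    """Split an s3:// URI into (bucket, key) so the artifact URL registered
    in SAP AI Core can be compared with the object's real location."""
    parsed = urlparse(uri)
    if parsed.scheme != "s3":
        raise ValueError(f"not an s3:// URI: {uri}")
    return parsed.netloc, parsed.path.lstrip("/")
```

If the key returned here does not match the object's prefix in the bucket, the sync into /mnt/models will silently fetch nothing and the pickle load fails later.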
Looks good.
However, we probably need to check it step by step:
1. Download the model (model.pkl) from S3 to your local machine, put it into the folder /mnt/models, and start the application locally; let's see whether the issue occurs there.
2. Build the image and run it locally.
3. Double-check the model file in S3. In my case I used HDFS as the object store, where archives are not supported, so I had to uncompress the model, push it to HDFS again, create a model artifact myself, then create the configuration and start the deployment; only after that did I get past this error. But that was specific to HDFS; if you're using S3, there should be no issue.
BR,
Eric
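On step 3: if the upload left the model compressed, `pickle.load` will fail even when the path is correct. A quick magic-byte check (my own sketch, not part of the tutorial) to tell whether model.pkl is really a raw pickle:

```python
def sniff_model_file(path):
    """Return 'gzip', 'zip', or 'raw' based on the file's magic bytes,
    to detect a model.pkl that is actually a compressed archive."""
    with open(path, "rb") as f:
        magic = f.read(4)
    if magic[:2] == b"\x1f\x8b":
        return "gzip"   # decompress before pickle.load
    if magic[:4] == b"PK\x03\x04":
        return "zip"    # actually a zip archive
    return "raw"        # plausibly a plain pickle
```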
Configuration:
After the deployment completed, the log shows:
INFO:root:Successfully copied hdfs://example-dataset/house-price-toy/model/model.pkl to /mnt/models
INFO:root:destination file path [/mnt/models/model.pkl]
INFO:hdfs.client:Downloading '/example-dataset/house-price-toy/model/model.pkl' to '/mnt/models'.
Inference call:
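The inference call from the screenshot can be sketched like this; note that the `/v2/predict` suffix, the header names, and the payload shape are assumptions on my part, so check them against your serving code's route:

```python
import json

def build_inference_request(deployment_url, token, features):
    """Assemble the URL, headers, and JSON body for an inference POST
    against an AI Core deployment URL (e.g. from Postman)."""
    url = deployment_url.rstrip("/") + "/v2/predict"
    headers = {
        "Authorization": f"Bearer {token}",
        "AI-Resource-Group": "default",   # adjust to your resource group
        "Content-Type": "application/json",
    }
    body = json.dumps({"features": features})
    return url, headers, body
```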
Hopefully this helps you.
BR,
Eric