user19195895

Reputation: 41

Kubeflow pipeline fails in GCP - using cluster with Kubeflow pipeline integration

I am compiling my script with the kfp v2 SDK and uploading the resulting YAML file to Kubeflow. The runs are failing with the error below:

FileNotFoundError: [Errno 2] No such file or directory: '/gcs/my_bucket/tfx_taxi_simple/7e62cf81-31a1-42bd-b145-47c3d1f24758/pipeline/test-updated2/899ccb6a-0f39-4cc8-8448-ef8c614009cc/get-dataframe/df_path.csv'
F0202 13:03:30.504261      16 main.go:50] Failed to execute component: exit status 1
time="2023-02-02T13:03:30.507Z" level=error msg="cannot save artifact /tmp/outputs/test_df_path/data" argo=true error="stat /tmp/outputs/test_df_path/data: no such file or directory"
time="2023-02-02T13:03:30.507Z" level=error msg="cannot save artifact /tmp/outputs/train_df_path/data" argo=true error="stat /tmp/outputs/train_df_path/data: no such file or directory"
Error: exit status 1

The sample Kubeflow pipelines, however, run fine. I have observed that the YAML file I am uploading follows a different template than the sample ones.

I am not sure whether this is a version issue or something else.

Can anyone help me resolve this?

I have tried running the scripts on Colab and generating the YAML for both pipelines; they produce different YAML. I am expecting my Kubeflow pipeline, which uses the kfp v2 compiler to generate the YAML, to run.
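
For reference, this is a minimal sketch of the kind of setup described above, assuming a lightweight kfp v2 component that writes a dataset artifact and a pipeline compiled to YAML with the v2 compiler. The component, pipeline, and file names (get_dataframe, my_pipeline, pipeline.yaml) are illustrative placeholders, not the actual failing code:

from kfp import compiler, dsl
from kfp.dsl import Dataset, Output


@dsl.component(base_image="python:3.9", packages_to_install=["pandas"])
def get_dataframe(df_path: Output[Dataset]):
    # Write the dataframe to the artifact path provided by KFP so the
    # backend can copy it to the pipeline root (e.g. a GCS bucket).
    import pandas as pd

    df = pd.DataFrame({"a": [1, 2, 3]})
    df.to_csv(df_path.path, index=False)


@dsl.pipeline(name="test-pipeline")
def my_pipeline():
    get_dataframe()


if __name__ == "__main__":
    # The kfp v2 compiler emits the IR YAML that is then uploaded to
    # the Kubeflow Pipelines UI.
    compiler.Compiler().compile(my_pipeline, package_path="pipeline.yaml")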

Upvotes: 1

Views: 235

Answers (0)
