LexByte

Reputation: 412

Upload pipeline on Kubeflow

I am currently trying to set up a Kubeflow pipeline. My use case requires that the pipeline configuration be provided as a YAML/JSON structure. Looking into the documentation on submitting pipelines, I came across this paragraph:

Each pipeline is defined as a Python program. Before you can submit a pipeline to the Kubeflow Pipelines service, you must compile the pipeline to an intermediate representation. The intermediate representation takes the form of a YAML file compressed into a .tar.gz file.
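For context, that compile step looks roughly like the sketch below (a minimal example; `my_pipeline`, the container image, and the output path are placeholders):

 import kfp
 from kfp import dsl

 @dsl.pipeline(name='Example Pipeline', description='Placeholder pipeline')
 def my_pipeline():
     # Single-step pipeline; image and command are placeholders
     dsl.ContainerOp(
         name='echo',
         image='alpine:3.12',
         command=['echo', 'hello world'],
     )

 # Produces a .tar.gz archive whose member is the YAML intermediate representation
 kfp.compiler.Compiler().compile(my_pipeline, 'pipeline.tar.gz')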

Is it possible to upload/submit a pipeline to Kubeflow as a JSON representation, or any other representation, instead of a compressed file (.tar.gz)? Is there a way to bypass persisting the files (zips and .tar.gz) on the filesystem and store them in a database as a YAML/JSON representation instead?

Upvotes: 4

Views: 1509

Answers (1)

Dilip Sharma

Reputation: 71

When you compile your Python pipeline code, the result is a compressed file containing a YAML file. You can decompress it, take out the YAML file, and add its contents to your database table.
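For example, pulling the YAML out of the compiled archive could look like this (a sketch using only the standard library; the archive name is an assumption):

 import tarfile

 # Open the compiled archive and read out the YAML intermediate representation.
 # The archive typically contains a single YAML member; inspect getnames()
 # if your compiler version names it differently.
 with tarfile.open('pipeline.tar.gz', 'r:gz') as archive:
     member = archive.getnames()[0]
     yaml_text = archive.extractfile(member).read().decode('utf-8')

 # yaml_text can now be stored in a database column instead of on disk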

Later, if you want to upload it to Kubeflow, use the following code:

 import kfp

 pipeline_file_path = 'pipelines.yaml'  # extract it from your database
 pipeline_name = 'Your Pipeline Name'

 # Upload the YAML file to the Kubeflow Pipelines service
 client = kfp.Client()
 pipeline = client.pipeline_uploads.upload_pipeline(
     pipeline_file_path, name=pipeline_name)
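If the YAML lives only in your database, you can round-trip it through a temporary file before uploading, since the upload call expects a file path. A sketch, where `fetch_pipeline_yaml_from_db` is a hypothetical helper that returns the stored YAML string:

 import tempfile

 import kfp

 yaml_text = fetch_pipeline_yaml_from_db()  # hypothetical helper

 # The upload API expects a path on disk, so write the YAML to a temporary file first
 with tempfile.NamedTemporaryFile(mode='w', suffix='.yaml', delete=False) as tmp:
     tmp.write(yaml_text)
     tmp_path = tmp.name

 client = kfp.Client()
 client.pipeline_uploads.upload_pipeline(tmp_path, name='Your Pipeline Name')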

Upvotes: 2
