TCR

Reputation: 45

Argo: Variable number of output artifacts

In one of my Argo workflow steps, a Docker container splits a large file into a number of smaller files.

The tutorials show how one can save a small and pre-determined number of outputs (e.g., 2 or 3) as artifacts in an S3 bucket by going through each output one at a time.

In my use case, I do not know in advance how many smaller files will be created; it can be upwards of hundreds. The large number of output files makes it hard, if not impossible, to follow the tutorials and specify each one individually, even if I knew in advance how many smaller files would be created.

Is there a way to save all the outputs to an S3 bucket?

Upvotes: 2

Views: 1674

Answers (1)

Argo Proj

Reputation: 146

This sounds like a standard use of output artifacts. You can put all your files in a single directory, and then have the directory itself be the output artifact.

Here are some examples to help you:

https://argoproj.github.io/argo-workflows/examples/#artifacts
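For illustration, here is a minimal sketch of what that could look like, assuming a default S3 artifact repository is already configured for the cluster; the image, file names, and split command are placeholders standing in for your real splitter:

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: split-to-artifacts-
spec:
  entrypoint: split
  templates:
    - name: split
      container:
        image: alpine:3.18
        command: [sh, -c]
        # Placeholder splitter: generates a large file, then splits it
        # into many smaller files under /tmp/parts.
        args:
          - |
            seq 1 100000 > /tmp/big.txt
            mkdir -p /tmp/parts
            split -l 1000 /tmp/big.txt /tmp/parts/part-
      outputs:
        artifacts:
          # Pointing `path` at a directory makes the whole directory a
          # single output artifact; Argo archives it (a .tgz by default)
          # and uploads it to the configured artifact repository, e.g.
          # your S3 bucket.
          - name: parts
            path: /tmp/parts
```

Because the artifact spec names the directory rather than the files, it works unchanged whether the step produces three files or three hundred.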

Upvotes: 1
