Annalix

Trigger the training of my packaged model with a Cloud Function from Google Storage ***SOLVED***

The approach used in this post is obsolete: you can find a solution using Vertex AI in the post linked below:

Automate batch predictions with Vertex AI pipeline and Kubeflow component ***SOLVED***

###############################

I am new to Cloud Functions and AI Platform Pipelines.

I have my customised model package stored in GCS, and I run its training from my laptop with the script training.sh,

where training.sh is:

now=$(date +"%Y%m%d_%H%M%S")  # timestamp for a unique job id ($now was otherwise undefined)
gcloud ai-platform jobs submit training model_training_$now \
--scale-tier basic \
--packages gs://my_project_bucket/my_package_model-0.1.2.tar.gz \
--module-name model.train_pipeline \
--job-dir=gs://my_project_bucket/trained_model \
--region europe-west1 \
--runtime-version=2.5 \
--python-version=3.7 \
-- \
--user_first_arg=first_arg_value --user_second_arg=second_arg_value

I am trying to automate the training so that it runs every time a new file is uploaded to the input_data bucket, and I am using a Cloud Function to do this. However, it is not clear to me how to use Kubeflow to run the training.sh step.
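For reference, the event-triggered part on its own does not require Kubeflow: a GCS-triggered Cloud Function can submit the same training job through the AI Platform REST API. Below is a minimal sketch, not a verified solution; the bucket and package names are taken from the question, and the project id is a placeholder:

```python
# Sketch of a GCS-triggered Cloud Function that submits the training job
# from training.sh via the AI Platform (ml v1) REST API.
import time

PROJECT_ID = "my-project"  # placeholder: substitute your real project id


def build_training_job_body(job_id):
    """Build the jobs.create request body mirroring the flags in training.sh."""
    return {
        "jobId": job_id,
        "trainingInput": {
            "scaleTier": "BASIC",
            "packageUris": ["gs://my_project_bucket/my_package_model-0.1.2.tar.gz"],
            "pythonModule": "model.train_pipeline",
            "jobDir": "gs://my_project_bucket/trained_model",
            "region": "europe-west1",
            "runtimeVersion": "2.5",
            "pythonVersion": "3.7",
            "args": [
                "--user_first_arg=first_arg_value",
                "--user_second_arg=second_arg_value",
            ],
        },
    }


def trigger_training(event, context):
    """Cloud Function entry point: fires on each new object in input_data."""
    # Imported here so the rest of the module stays stdlib-only.
    from googleapiclient import discovery

    job_id = "model_training_%d" % int(time.time())
    ml = discovery.build("ml", "v1", cache_discovery=False)
    ml.projects().jobs().create(
        parent="projects/%s" % PROJECT_ID,
        body=build_training_job_body(job_id),
    ).execute()
```

The function would be deployed with a `google.storage.object.finalize` trigger on the input_data bucket, so each upload submits one training job.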

I am using this tutorial

https://cloud.google.com/blog/products/ai-machine-learning/using-remote-and-event-triggered-ai-platform-pipelines

and its notebook

https://github.com/amygdala/code-snippets/blob/master/ml/notebook_examples/functions/hosted_kfp_gcf.ipynb

There, the author defines a sequential pipeline by creating container ops:

import kfp.dsl as dsl


def sequential_pipeline(filename='gs://ml-pipeline-playground/shakespeare1.txt'):
    """A pipeline with two sequential steps."""
    op1 = dsl.ContainerOp(
        name='filechange',
        image='library/bash:4.4.23',
        command=['sh', '-c'],
        arguments=['echo "%s" > /tmp/results.txt' % filename],
        file_outputs={'newfile': '/tmp/results.txt'})
    op2 = dsl.ContainerOp(
        name='echo',
        image='library/bash:4.4.23',
        command=['sh', '-c'],
        arguments=['echo "%s"' % op1.outputs['newfile']])

I cannot see how to define a similar function to run my training.sh. Do I need to containerise my model package my_package_model-0.1.2.tar.gz?
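One way to sketch this without containerising the package itself: since the package already lives in GCS, a single pipeline step could run the `gcloud` command from training.sh inside the public `google/cloud-sdk` image. This is an unverified sketch under that assumption, with all names mirroring the question:

```python
# Sketch: wrap the training.sh submission in a one-step pipeline using the
# public google/cloud-sdk image, instead of containerising the model package.


def build_submit_command(job_id):
    """Return the gcloud command string that training.sh runs."""
    return (
        "gcloud ai-platform jobs submit training %s "
        "--scale-tier basic "
        "--packages gs://my_project_bucket/my_package_model-0.1.2.tar.gz "
        "--module-name model.train_pipeline "
        "--job-dir=gs://my_project_bucket/trained_model "
        "--region europe-west1 "
        "--runtime-version=2.5 "
        "--python-version=3.7 "
        "-- "
        "--user_first_arg=first_arg_value "
        "--user_second_arg=second_arg_value" % job_id
    )


def training_pipeline(job_id='model_training_manual'):
    """One-step pipeline: a cloud-sdk container submits the training job."""
    import kfp.dsl as dsl  # local import: kfp is only needed at compile time
    dsl.ContainerOp(
        name='submit-training',
        image='google/cloud-sdk:slim',
        command=['sh', '-c'],
        arguments=[build_submit_command(job_id)],
    )
```

The container would also need credentials (e.g. the cluster's service account) with permission to submit AI Platform jobs; that part is environment-specific and not shown here.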

Is anyone familiar with this type of automation?
