AlexDrenea

Reputation: 8039

Updating a Dataflow job from the rest API

I am trying to programmatically update a Cloud Dataflow job by using the REST API as described here

I have a PubSub to BigQuery job and my end goal is to replace the BigQuery output table.

I've tried updating the current job with a new job by using the replacedByJobId field, but I always get this error:

{
  "error": {
    "code": 400,
    "message": "(b7fd8310f1b85ccf): Could not modify workflow; invalid modifier value: 0",
    "status": "INVALID_ARGUMENT"
  }
}

Request body:

{
  "id": "jobid",
  "projectId": "projectId",
  "replacedByJobId": "newJobId"
}

Is there another way to either replace a running job's parameters (OutputTable) or replace a running job with a new similar job?
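For reference, the attempt described above corresponds to a projects.jobs.update call (HTTP PUT) against the Dataflow REST API. A minimal Java sketch of assembling that request, using the placeholder IDs from the question; actually sending it would also need an OAuth bearer token, which is omitted here:

```java
// Sketch: assemble the projects.jobs.update request described in the question.
// "projectId", "jobid" and "newJobId" are the placeholder values from the question.
public class DataflowUpdateRequest {

    // Dataflow REST API v1b3 endpoint for projects.jobs.update (HTTP PUT).
    public static String url(String projectId, String jobId) {
        return "https://dataflow.googleapis.com/v1b3/projects/"
                + projectId + "/jobs/" + jobId;
    }

    // Request body carrying the replacedByJobId field, as valid JSON.
    public static String body(String jobId, String projectId, String newJobId) {
        return "{\n"
             + "  \"id\": \"" + jobId + "\",\n"
             + "  \"projectId\": \"" + projectId + "\",\n"
             + "  \"replacedByJobId\": \"" + newJobId + "\"\n"
             + "}";
    }

    public static void main(String[] args) {
        System.out.println("PUT " + url("projectId", "jobid"));
        System.out.println(body("jobid", "projectId", "newJobId"));
    }
}
```

As the error above shows, the service rejects this particular request; the answers below point at the supported update path.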

Upvotes: 1

Views: 1266

Answers (2)

Muruganandam C

Reputation: 97

You can update a running streaming job by re-launching the same pipeline with the --update flag and the same jobName:

java -jar pipeline/build/libs/pipeline-service-1.0.jar \
        --project=my-project \
        --zone=us-central1-f \
        --streaming=true \
        --stagingLocation=gs://my-bucket/tmp/dataflow/staging/ \
        --runner=DataflowPipelineRunner \
        --numWorkers=5 \
        --workerMachineType=n1-standard-2 \
        --jobName=ingresspipeline \
        --update

Upvotes: 0

Scott Wegner

Reputation: 7493

In order to update a job you also need to provide a compatible replacement job. Note that update is currently only supported using the Java SDK.

You can find documentation on updating using the Java SDK at: Updating an Existing Pipeline: Launching Your Replacement Job.
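Putting the two answers together: the replacement job is launched from the Java SDK with the same jobName plus --update, changing only the parameter that needs to differ (the BigQuery output table, in the question's case). A toy sketch of assembling those launch arguments; the flag names mirror the command in the first answer, and --outputTable is a hypothetical pipeline-specific option, not a built-in Dataflow flag:

```java
import java.util.ArrayList;
import java.util.List;

// Toy sketch: build the argument list for relaunching the pipeline as an
// update of the running job. --outputTable is a hypothetical option that the
// pipeline itself would have to define.
public class ReplacementLaunchArgs {

    public static List<String> build(String project, String jobName, String newTable) {
        List<String> args = new ArrayList<>();
        args.add("--project=" + project);
        args.add("--runner=DataflowPipelineRunner");
        args.add("--streaming=true");
        args.add("--jobName=" + jobName);      // must match the running job
        args.add("--update");                  // ask the service to replace it
        args.add("--outputTable=" + newTable); // hypothetical pipeline option
        return args;
    }

    public static void main(String[] args) {
        build("my-project", "ingresspipeline", "my-project:dataset.new_table")
                .forEach(System.out::println);
    }
}
```

The service accepts the replacement only if it is compatible with the running job, as the answer above notes.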

Upvotes: 1
