andres acosta

Reputation: 11

Dataflow job: "ima: Can not allocate sha384 (reason: -2)"

I am trying to launch my workflow, but it fails with this error.

Command:

python nbo.py --temp_location gs://xxxxx/tmp/ --project xxxxx --region us-central1 --runner DataflowRunner --job_name xxxxx --output_table xxxxx.xxxxx --input_subscription projects/xxxxx/subscriptions/xxxxx

Error: "ima: Can not allocate sha384 (reason: -2)"

Does anyone have the solution?

Upvotes: 1

Views: 1383

Answers (1)

Mike Williamson

Reputation: 3260

This is a failure of the disk to attach. In my experience it often "doesn't matter": the attach fails on the first or second try, but a later retry succeeds.

But there is Dataflow documentation on this:

Failed to attach disk

When you try to launch a Dataflow job that uses C3 VMs with Persistent Disk, the job fails with one or both of the following errors:

Failed to attach disk(s), status: generic::invalid_argument: One or more operations had an error

Can not allocate sha384 (reason: -2), Spectre V2 : WARNING: Unprivileged eBPF is enabled with eIBRS on...

These errors occur when you use C3 VMs with an unsupported Persistent Disk type. For more information, see Supported disk types for C3.

To use C3 VMs with your Dataflow job, choose the pd-ssd worker disk type. For more information, see Worker-level options.

[Python] --worker_disk_type=pd-ssd
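Applied to the command from the question (placeholders kept exactly as they appear there), the launch would look something like this:

python nbo.py --temp_location gs://xxxxx/tmp/ --project xxxxx --region us-central1 --runner DataflowRunner --job_name xxxxx --output_table xxxxx.xxxxx --input_subscription projects/xxxxx/subscriptions/xxxxx --worker_disk_type=pd-ssd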

Here is the link with details.
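If you build the options inside the script instead of on the command line, the same flag can be passed when constructing PipelineOptions. This is a minimal sketch, assuming the Apache Beam Python SDK; every xxxxx value is a placeholder from the question, and the actual pipeline steps are elided:

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Build the options the same way the command line would;
# --worker_disk_type is the flag named in the Dataflow docs quoted above.
options = PipelineOptions([
    "--runner=DataflowRunner",
    "--project=xxxxx",
    "--region=us-central1",
    "--temp_location=gs://xxxxx/tmp/",
    "--worker_disk_type=pd-ssd",  # a supported Persistent Disk type for C3 VMs
])

with beam.Pipeline(options=options) as pipeline:
    # ... Pub/Sub read, transforms, and BigQuery write go here
    pass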

Upvotes: 0
