dani herrera

Reputation: 51645

Google Dataflow fails writing to bucket

I'm not able to run the sample wordcount Dataflow pipeline. This error appears:

(venv) dh4@GLOW:$ python -m apache_beam.examples.wordcount \
                 --project "$PROJECT"   --runner DataflowRunner \
                 --staging_location $BUCKET/staging \
                 --temp_location $BUCKET/tmp \
                 --output $BUCKET/results/output

IOError: Could not upload to GCS path gs://prova_df/staging/beamapp-dh4-1014444431-510588.1505555051.55555: access denied. Please verify that credentials are valid and that you have write access to the specified path.

But I don't have permission problems with this bucket:

(venv) dh4@GLOW:$ gsutil cp  paraules.txt gs://prova_df
Copying file://paraules.txt [Content-Type=text/plain]...
- [1 files][   24.0 B/   24.0 B]                                                
Operation completed over 1 objects/24.0 B. 
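For context, gsutil and the Beam SDK do not necessarily pick up the same credentials: gsutil can use credentials configured through gcloud, while the Beam Python SDK resolves Application Default Credentials. Below is a minimal sketch, assuming the google-cloud-storage client library is installed and using the prova_df bucket from the question (the project id is a placeholder), that probes write access to the staging prefix with those default credentials:

# Minimal sketch (assumptions: google-cloud-storage is installed, the bucket
# name matches the question, and "your-project-id" is a placeholder).
# It writes and deletes a tiny probe object using Application Default
# Credentials, the credential chain the Beam SDK resolves.
from google.cloud import storage

BUCKET_NAME = "prova_df"                             # bucket from the question

client = storage.Client(project="your-project-id")   # placeholder project id
blob = client.bucket(BUCKET_NAME).blob("staging/_write_test")

blob.upload_from_string("probe")   # raises 403 Forbidden if write access is missing
blob.delete()                      # clean up the probe object
print("Write access to gs://%s/staging confirmed" % BUCKET_NAME)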

Some extra info:

Upvotes: 0

Views: 591

Answers (1)

dani herrera

Reputation: 51645

After writing the question, I closed my laptop and went for a walk. I took a picture of an octopus.

[photo of the octopus]

Then I came back to the office, started the laptop again, and when I tried again without changing a single line of config or code, it worked without problems:

[screenshot of the pipeline running successfully]

So it looks like it takes a while for all the resources to become available. Just be patient and go hunting octopuses (with your camera).
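If you'd rather not wait blindly, here is a minimal sketch of what "be patient" can look like in code, again assuming the google-cloud-storage client library (the wait_until_writable helper and its parameters are hypothetical): it retries a small write probe against the staging prefix until the newly granted permissions have propagated.

# Minimal sketch: poll the staging prefix with a write probe until it is
# writable, then launch the Dataflow job. Helper name and retry parameters
# are hypothetical.
import time
from google.cloud import storage
from google.api_core import exceptions

def wait_until_writable(bucket_name, prefix="staging", attempts=10, delay=30):
    client = storage.Client()
    blob = client.bucket(bucket_name).blob(f"{prefix}/_write_test")
    for _ in range(attempts):
        try:
            blob.upload_from_string("probe")
            blob.delete()
            return True                  # write succeeded, safe to launch the job
        except exceptions.Forbidden:
            time.sleep(delay)            # permissions not propagated yet, retry
    return False

if wait_until_writable("prova_df"):
    print("Staging path is writable; launch the Dataflow job now.")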

Upvotes: 3
