Reputation: 51645
I'm not able to run the sample wordcount Dataflow pipeline. An error appears:
(venv) dh4@GLOW:$ python -m apache_beam.examples.wordcount \
--project "$PROJECT" --runner DataflowRunner \
--staging_location $BUCKET/staging \
--temp_location $BUCKET/tmp \
--output $BUCKET/results/output
IOError: Could not upload to GCS path gs://prova_df/staging/beamapp-dh4-1014444431-510588.1505555051.55555: access denied. Please verify that credentials are valid and that you have write access to the specified path.
But I have no permission problems with this bucket:
(venv) dh4@GLOW:$ gsutil cp paraules.txt gs://prova_df
Copying file://paraules.txt [Content-Type=text/plain]...
- [1 files][ 24.0 B/ 24.0 B]
Operation completed over 1 objects/24.0 B.
Some extra info:
- The service account has the owner role on the project.
- I ran `rm ~/.gsutil/credstore` to ensure there are no cached credentials.
- `GOOGLE_APPLICATION_CREDENTIALS` is exported.
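Before launching, it can also help to sanity-check the pipeline options locally. A minimal sketch (the `validate_gcs_path` helper is my own, not part of Beam; the `gs://prova_df/...` paths mirror the invocation above):

```python
import os
import re

def validate_gcs_path(path):
    """Return True if path looks like a valid gs://bucket[/object] location."""
    return re.match(r"^gs://[a-z0-9][a-z0-9._-]{1,61}[a-z0-9](/.*)?$", path) is not None

# Options mirroring the wordcount invocation above.
options = {
    "staging_location": "gs://prova_df/staging",
    "temp_location": "gs://prova_df/tmp",
    "output": "gs://prova_df/results/output",
}

for name, path in options.items():
    if not validate_gcs_path(path):
        raise ValueError(f"{name} is not a valid GCS path: {path}")

# The Dataflow runner needs credentials to upload to the staging bucket.
creds = os.environ.get("GOOGLE_APPLICATION_CREDENTIALS")
print("credentials file set:", creds is not None)
```

This only catches malformed paths and a missing credentials variable; it cannot detect the server-side propagation delay described in the answer below.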
Upvotes: 0
Views: 591
Reputation: 51645
After writing the question, I closed my laptop and went for a walk. I took a picture of an octopus.
Then I came back to the office, started the laptop again, and when I tried again, without changing a single line of config or code, it worked without problems.
So it seems it takes a while for all the resources to be enabled. Just be patient and go hunting octopuses (with your camera).
Upvotes: 3