Reputation: 31
I created a Dataproc cluster with 6 nodes and I face the issue below when I try to install bdutil:
******************* gcloud compute stderr *******************
ERROR: (gcloud.compute.disks.create) Could not fetch resource:
- Insufficient Permission
ERROR: (gcloud.compute.disks.create) Could not fetch resource:
- Insufficient Permission
ERROR: (gcloud.compute.disks.create) Could not fetch resource:
- Insufficient Permission
ERROR: (gcloud.compute.disks.create) Could not fetch resource:
- Insufficient Permission
ERROR: (gcloud.compute.disks.create) Could not fetch resource:
- Insufficient Permission
************ ERROR logs from gcloud compute stderr ************
ERROR: (gcloud.compute.disks.create) Could not fetch resource:
ERROR: (gcloud.compute.disks.create) Could not fetch resource:
ERROR: (gcloud.compute.disks.create) Could not fetch resource:
ERROR: (gcloud.compute.disks.create) Could not fetch resource:
ERROR: (gcloud.compute.disks.create) Could not fetch resource:
******************* Exit codes and VM logs *******************
Sun Sep 23 23:54:02 UTC 2018: Exited 1 : gcloud --project=hdpgcp-217320 --quiet --verbosity=info compute disks create --size=1500 --type=pd-standard hadoop-w-0-pd --zone=zone(unset)
Sun Sep 23 23:54:02 UTC 2018: Exited 1 : gcloud --project=hdpgcp-217320 --quiet --verbosity=info compute disks create --size=1500 --type=pd-standard hadoop-w-1-pd --zone=zone(unset)
Sun Sep 23 23:54:02 UTC 2018: Exited 1 : gcloud --project=hdpgcp-217320 --quiet --verbosity=info compute disks create --size=1500 --type=pd-standard hadoop-w-2-pd --zone=zone(unset)
Sun Sep 23 23:54:02 UTC 2018: Exited 1 : gcloud --project=hdpgcp-217320 --quiet --verbosity=info compute disks create --size=1500 --type=pd-standard hadoop-w-3-pd --zone=zone(unset)
Sun Sep 23 23:54:02 UTC 2018: Exited 1 : gcloud --project=hdpgcp-217320 --quiet --verbosity=info compute disks create --size=1500 --type=pd-standard hadoop-m-pd --zone=zone(unset)
Upvotes: 0
Views: 400
Reputation: 2099
HDP and Dataproc are different products: you don't need to create a Dataproc cluster to run bdutil. It is enough to run it from a single instance, because all the required configuration is set in bdutil_env.sh/ambari.conf. bdutil does not create a Dataproc cluster; instead, it creates plain VM instances to host HDP.
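For reference, a minimal sketch of such an invocation, assuming the HDP env file path that ships in the bdutil repository (the bucket and zone values are placeholders, and flags can vary between bdutil versions):

    # Run from any machine with the Cloud SDK installed; no Dataproc cluster needed.
    ./bdutil -e platforms/hdp/ambari_env.sh -b my-config-bucket -z us-central1-a deploy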
Here are some steps that are not very well documented:
1. I set the GOOGLE_APPLICATION_CREDENTIALS variable and the permission issue was gone. Most probably this is the issue you are facing (see the sketch below).
1.1 If it doesn't work, execute this command: gcloud auth activate-service-account --key-file=/PATH/JSON_CREDENTIALS
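A minimal sketch of both steps together (the key path is a placeholder):

    # Point API clients at your service-account key (path is an example).
    export GOOGLE_APPLICATION_CREDENTIALS=/path/to/service-account-key.json

    # If "Insufficient Permission" persists, authorize the gcloud CLI itself:
    gcloud auth activate-service-account --key-file="$GOOGLE_APPLICATION_CREDENTIALS"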
2. If other errors appear, like 'Invalid value zone(unset)', just set the missing values in bdutil_env.sh.
2.1 If the same errors remain, go directly to platforms/hdp/ambari.conf and update your configuration there (see the example below).
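As a hedged example, the relevant bdutil_env.sh entries look roughly like this (variable names can differ between bdutil versions, so check the comments in your copy of the file):

    PROJECT='hdpgcp-217320'          # the project from the failing gcloud commands
    GCE_ZONE='us-central1-a'         # fixes the "--zone=zone(unset)" failures
    CONFIGBUCKET='my-config-bucket'  # bucket used for staging configuration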
3. You will need to set up permissive firewall rules to access your instances: to allow communication between the nodes, and for you to reach Ambari on the master.
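A minimal sketch, assuming the default network and Ambari's default port 8080 (rule names and source ranges are placeholders; tighten them for anything beyond a test setup):

    # Allow the cluster nodes to talk to each other on the internal range.
    gcloud compute firewall-rules create allow-hdp-internal \
        --network=default --allow=tcp,udp,icmp --source-ranges=10.128.0.0/9

    # Allow your own IP to reach the Ambari UI on the master.
    gcloud compute firewall-rules create allow-ambari-ui \
        --network=default --allow=tcp:8080 --source-ranges=YOUR_IP/32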
After completing the steps above I could use Ambari to install HDP.
Upvotes: 1