xyphan

Reputation: 105

Create cluster with Shared Network in GKE

I’m trying to create a GKE cluster in project-1 using the shared VPC network of project-2.

Roles granted to the service account:
project-1: Kubernetes Engine Cluster Admin, Compute Network Admin, Kubernetes Engine Host Service Agent User
project-2: Kubernetes Engine Service Agent, Compute Network User, Kubernetes Engine Host Service Agent User

The service account was created under project-1, and the required APIs are enabled in both projects.

But I keep getting this error:

Error: googleapi: Error 403: Kubernetes Engine Service Agent is missing required permissions on this project. See Troubleshooting | Kubernetes Engine Documentation | Google Cloud for more info: required “container.hostServiceAgent.use” permission(s) for “projects/project-2”., forbidden

data "google_compute_network" "shared_vpc" {
    name = "network-name-in-project-2"
    project = "project-2"
}

 
data "google_compute_subnetwork" "shared_subnet" {
    name = "subnet-name-in-project-2"
    project = "project-2"
    region = "us-east1"
}

 # cluster creation under project 1
 # project 1 specified in Provider 
resource "google_container_cluster" "mowx_cluster" {
    name = var.cluster_name
    location = "us-east1"
    initial_node_count = 1
 
    master_auth {
        username = ""
        password = ""
 
        client_certificate_config {
            issue_client_certificate = false
        }
    }
 
    remove_default_node_pool = true
    cluster_autoscaling {
        enabled = false
    }
 
    # cluster_ipv4_cidr = var.cluster_pod_cidr
    ip_allocation_policy {
        cluster_secondary_range_name = "pods"
        services_secondary_range_name = "svc"
    }
 
    network = data.google_compute_network.shared_vpc.id
    subnetwork = data.google_compute_subnetwork.shared_subnet.id
}

Upvotes: 3

Views: 7358

Answers (3)

zegoat7

Reputation: 579

The steps I followed to resolve the same issue:

On the service project level:

  1. Check that the Kubernetes Engine API is enabled on the service project
  2. Check that both service accounts exist by selecting the Include Google-provided role grants option in the upper-right corner of the Google IAM console:
    1. the GKE service account (service-SERVICE_PROJECT_NUM@container-engine-robot.iam.gserviceaccount.com)
    2. the Google APIs service account (SERVICE_PROJECT_NUM@cloudservices.gserviceaccount.com)

On the host project level:

  1. Check that the Kubernetes Engine API is enabled on the host project
  2. Grant the GKE service account shown above the roles/container.hostServiceAgentUser and roles/compute.networkUser roles
  3. Grant the Google APIs service account shown above the roles/compute.networkUser role

This allows the Kubernetes Engine service account to configure shared network resources at the host-project level for clusters created in service projects; a Terraform sketch of these grants follows below.

Note: the IAM role binding can also be granted at the subnet level instead of across the whole host project.
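
For reference, a minimal Terraform sketch of the host-project grants from steps 2 and 3 above; the SERVICE_PROJECT_NUM placeholder and resource names are assumptions, not values taken from the question:

# Grant the GKE service agent of the service project the Host Service
# Agent User and Network User roles on the host project (project-2).
resource "google_project_iam_member" "gke_host_agent_user" {
  project = "project-2" # host project
  role    = "roles/container.hostServiceAgentUser"
  member  = "serviceAccount:service-SERVICE_PROJECT_NUM@container-engine-robot.iam.gserviceaccount.com"
}

resource "google_project_iam_member" "gke_network_user" {
  project = "project-2"
  role    = "roles/compute.networkUser"
  member  = "serviceAccount:service-SERVICE_PROJECT_NUM@container-engine-robot.iam.gserviceaccount.com"
}

# Grant the Google APIs service account Network User on the host project.
resource "google_project_iam_member" "api_network_user" {
  project = "project-2"
  role    = "roles/compute.networkUser"
  member  = "serviceAccount:SERVICE_PROJECT_NUM@cloudservices.gserviceaccount.com"
}

google_project_iam_member is non-authoritative, so these grants add to whatever bindings already exist rather than replacing them.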

Upvotes: 0

intotecho

Reputation: 5684

For me, even though the GKE service account existed and had the Kubernetes Engine Host Service Agent User and Kubernetes Engine Service Agent roles in both the service and host projects, I still got the 403 error.

The problem was that the service account also needed the roles/compute.networkUser and roles/compute.instanceAdmin roles granted on the shared VPC's subnetwork, i.e. a subnet-level IAM binding.

See the google_compute_subnetwork_iam_binding resource, and also the "shared_vpc_access" module.
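
A minimal sketch of such a subnet-level binding, reusing the host project and subnet names from the question; the SERVICE_PROJECT_NUM placeholder is an assumption:

# Grant Network User on the shared subnet only, rather than on the
# whole host project.
resource "google_compute_subnetwork_iam_binding" "gke_network_user" {
  project    = "project-2"
  region     = "us-east1"
  subnetwork = "subnet-name-in-project-2"
  role       = "roles/compute.networkUser"

  members = [
    "serviceAccount:service-SERVICE_PROJECT_NUM@container-engine-robot.iam.gserviceaccount.com",
  ]
}

An analogous binding can carry roles/compute.instanceAdmin. Note that google_compute_subnetwork_iam_binding is authoritative for its role on the subnet; google_compute_subnetwork_iam_member is the non-authoritative alternative.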

Upvotes: 1

Wytrzymały Wiktor

Reputation: 13878

This is a community wiki answer based on the discussion in the comments and posted for better visibility. Feel free to expand it.

The error you encountered:

Error: googleapi: Error 403: Kubernetes Engine Service Agent is missing required permissions on this project. See Troubleshooting | Kubernetes Engine Documentation | Google Cloud for more info: required “container.hostServiceAgent.use” permission(s) for “projects/project-2”., forbidden

means that the necessary service agent was not created:

roles/container.serviceAgent - Kubernetes Engine Service Agent:

Gives Kubernetes Engine account access to manage cluster resources. Includes access to service accounts.

The official troubleshooting docs describe a solution for such problems:

To resolve the issue, if you have removed the Kubernetes Engine Service Agent role from your Google Kubernetes Engine service account, add it back. Otherwise, you must re-enable the Kubernetes Engine API, which will correctly restore your service accounts and permissions. You can do this in the gcloud tool or the Cloud Console.

The solution above works because, in your case, the account was missing and had to be (re)created.
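
If these resources are managed with Terraform, a minimal sketch of both remedies, assuming project-1 is the service project (the PROJECT_NUM placeholder is an assumption):

# Keeping the API enabled (re)creates the Kubernetes Engine Service Agent.
# Re-enabling an already-enabled API still has to be done with gcloud or
# the Cloud Console.
resource "google_project_service" "container" {
  project = "project-1"
  service = "container.googleapis.com"
}

# Restore the default role if it was removed from the service agent.
resource "google_project_iam_member" "container_service_agent" {
  project = "project-1"
  role    = "roles/container.serviceAgent"
  member  = "serviceAccount:service-PROJECT_NUM@container-engine-robot.iam.gserviceaccount.com"
}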

Upvotes: 2
