hashaf

Reputation: 11

HttpError 403 when requesting the Cloud SQL Admin API from the client library

I'm running a Cloud Function across two different projects: the function is deployed in project A and the MySQL instance is in project B.

In order for the Cloud Function in project A to access the Cloud SQL instance and perform import jobs in project B, I've added the function's service account as a principal in project B and assigned it the roles "Cloud SQL Client" and "Cloud SQL Editor" in project B. I followed this guide: https://www.cloudquery.io/blog/creating-cross-project-service-accounts-in-gcp

However, when I run the function in project A, I still get this error: 

googleapiclient.errors.HttpError: <HttpError 403 when requesting https://sqladmin.googleapis.com/sql/v1beta4/projects/astute-coda-410816/instances/test_database/import?alt=json returned "The client is not authorized to make this request.". Details: "[{'message': 'The client is not authorized to make this request.', 'domain': 'global', 'reason': 'notAuthorized'}]">

This is the code I used:

import os
import time

import sqlalchemy
from googleapiclient import discovery
import google.auth
import functions_framework
from google.cloud import storage
from google.cloud.sql.connector import Connector

#clients created at module load time are reused across invocations
storage_client = storage.Client()
credentials, project = google.auth.default()
service = discovery.build('sqladmin', 'v1beta4', credentials=credentials)

#mysql connection factory used by SQLAlchemy
def getconn():
    connector = Connector()
    conn = connector.connect(
        "astute-coda-410816:europe-west2:test_database",
        "pymysql",
        user=os.environ["DB_USER"],
        password=os.environ["DB_PASS"],
        db="product")
    return conn

pool = sqlalchemy.create_engine("mysql+pymysql://", creator=getconn)

def move_data(table: str) -> None:
    #list files that need to be imported from gcs
    bucket = storage_client.get_bucket('bucket_data')
    blobs = bucket.list_blobs(prefix=table)
    files = [blob.name for blob in blobs]

    #import each file into mysql through the Cloud SQL Admin API
    for i in files:
        instances_import_request_body = {
            "importContext": {
                "uri": f"gs://bucket_data/{i}",
                "kind": "sql#importContext",
                "database": "product",
                "fileType": "CSV",
                "csvImportOptions": {"table": table}
            }
        }
        request = service.instances().import_(
            project="astute-coda-410816",
            instance="test_database",
            body=instances_import_request_body)
        response = request.execute()

        #poll the long-running operation until the import is done
        process_status = False
        while not process_status:
            resp = service.operations().get(
                project="astute-coda-410816",
                operation=response['name']).execute()
            if resp['status'] == "DONE":
                print("finish")
                process_status = True
            else:
                time.sleep(1)

    
@functions_framework.cloud_event
def transfer(cloudevent):
    #get status of Event method google.cloud.bigquery.v2.JobService.InsertJob from Cloud Audit Log
    payload = cloudevent.data.get("protoPayload")
    status = payload.get("status")
    if not status: #if status is empty, job was successful and files can be copied over to mysql
        move_data("product_data")

Does anyone know why?

Upvotes: 1

Views: 249

Answers (1)

guillaume blaquiere

Reputation: 75940

API activation was my first guess, but that's not the issue here.

Reading the error more carefully: you are trying to perform an import. This operation requires the cloudsql.instances.import permission, which is only included in the Cloud SQL Admin role (or the project Editor role, but that is a bad practice; don't use it).

More details in the documentation: https://cloud.google.com/sql/docs/mysql/iam-roles

(Read the description of the Cloud SQL Editor role; it is clearly mentioned that import does not work with it.)
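
For illustration only, here is a minimal sketch of granting that role to the function's service account on project B with the Resource Manager API, in the same discovery-client style as your code. The project IDs and service account email below are placeholders, not values from your question, and the same grant can be done with gcloud projects add-iam-policy-binding or in the console:

#hypothetical sketch: grant roles/cloudsql.admin on project B to the
#Cloud Function's service account; PROJECT_B_ID and the member email
#are placeholders
from googleapiclient import discovery
import google.auth

credentials, _ = google.auth.default()
crm = discovery.build("cloudresourcemanager", "v1", credentials=credentials)

project_b = "PROJECT_B_ID"
member = "serviceAccount:FUNCTION_SA@PROJECT_A_ID.iam.gserviceaccount.com"

#read-modify-write of project B's IAM policy
policy = crm.projects().getIamPolicy(resource=project_b, body={}).execute()
policy.setdefault("bindings", []).append({
    "role": "roles/cloudsql.admin",
    "members": [member],
})
crm.projects().setIamPolicy(resource=project_b, body={"policy": policy}).execute()

Whoever runs this sketch needs permission to set the IAM policy on project B (for example a project owner); once the binding is in place, the import call from your function should be authorized.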

Upvotes: 2
