Visrut

Reputation: 671

How do you debug a gRPC error when there are no details?

I have already found a discussion about better debugging for gRPC: How to debug grpc call?

but I am running into a scenario where, even after setting the GRPC_VERBOSITY and GRPC_TRACE flags, I can't find the actual error because the details field is empty!
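For reference, this is roughly how the tracing flags get applied. As far as I can tell, @grpc/grpc-js reads GRPC_VERBOSITY and GRPC_TRACE when it is first loaded, so they have to be set before the client library is imported, for example on the command line (GRPC_VERBOSITY=DEBUG GRPC_TRACE=all node index.mjs) or before a dynamic import, as in this sketch:

// Must be set before @grpc/grpc-js is loaded, hence the dynamic import below.
process.env.GRPC_VERBOSITY = 'DEBUG';
process.env.GRPC_TRACE = 'all';

const bigQueryDataTransfer = await import('@google-cloud/bigquery-data-transfer');
const client = new bigQueryDataTransfer.DataTransferServiceClient();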

I am trying to create a data transfer from S3 to BigQuery, but programmatically, so this is what I wrote after first going through the proto definitions here: https://github.com/googleapis/googleapis/blob/master/google/cloud/bigquery/datatransfer/v1/transfer.proto and with a little bit of AI help.

import * as bigQueryDataTransfer from '@google-cloud/bigquery-data-transfer';

const client = new bigQueryDataTransfer.DataTransferServiceClient();

// Create an S3 -> BigQuery transfer config; params is a google.protobuf.Struct,
// so each value has to be wrapped in its typed form ({ stringValue: ... }).
client
  .createTransferConfig({
    parent: `projects/project-name`,
    transferConfig: {
      destinationDatasetId: 'dataset-id',
      displayName: 'Test Transfer',
      dataSourceId: 'amazon_s3',
      schedule: 'every 3 hours',
      params: {
        fields: {
          bucket_name: {
            stringValue: 'bucket-name',
          },
          object_path: {
            stringValue: 'path/to/your/data/*',
          },
          aws_access_key_id: {
            stringValue: 'access-key',
          },
          aws_secret_access_key: {
            stringValue: 'secret-key',
          },
          region_name: {
            stringValue: 'us-west-1',
          },
        },
      },
    },
  })
  .then(([response]) => console.log(response))
  .catch((err) => console.error(err));

After running the above code I get this kind of error:

Error: 3 INVALID_ARGUMENT: 
...
...
    at process.processTicksAndRejections (node:internal/process/task_queues:105:5) {
  code: 3,
  details: '',
  metadata: Metadata { internalRepr: Map(0) {}, options: {} }
}

Now the problem is that the details field is empty and I can't figure out where the actual error is happening. Even the verbose output doesn't reveal anything; details is still empty:

D 2024-12-26T13:37:51.639Z | v1.12.5 26468 | subchannel_call | [3] received status code 3 from server
D 2024-12-26T13:37:51.639Z | v1.12.5 26468 | subchannel_call | [3] ended with status: code=3 details=""
D 2024-12-26T13:37:51.640Z | v1.12.5 26468 | load_balancing_call | [2] Received status
D 2024-12-26T13:37:51.640Z | v1.12.5 26468 | load_balancing_call | [2] ended with status: code=3 details="" start time=2024-12-26T13:37:50.437Z
D 2024-12-26T13:37:51.640Z | v1.12.5 26468 | retrying_call | [1] Received status from child [2]

So I want to know how you folks debug this kind of error. There is not much documentation on how to perform an S3 transfer using the npm SDK, which is why I have to do this trial-and-error play, and I also couldn't get much information from the .proto files on GitHub.

Don't get me wrong, sometimes it does provide useful errors: at one point I entered the dataset ID wrong and it gave me a detailed message along the lines of "dataset cannot be found". But sometimes it just returns an empty message with error code 3: INVALID_ARGUMENT. Which argument?
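For completeness, this is roughly how I am inspecting the error object. The code, details and metadata fields come straight from grpc-js; statusDetails is my assumption about what google-gax attaches when the server sends structured details in grpc-status-details-bin (in my runs it has not been populated):

try {
  await client.createTransferConfig({ /* same request as above */ });
} catch (err) {
  console.error('code:', err.code);           // 3 = INVALID_ARGUMENT
  console.error('details:', err.details);     // empty string in my case
  console.error('metadata:', err.metadata?.getMap());
  // google-gax may decode google.rpc.* messages (e.g. BadRequest with field
  // violations) into statusDetails when the server provides them.
  if (err.statusDetails) {
    console.error('statusDetails:', JSON.stringify(err.statusDetails, null, 2));
  }
}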

Upvotes: 0

Views: 45

Answers (1)

Visrut

Reputation: 671

OK, this is not an official answer on how to debug when the details message is empty, but I have to say the Python library has some really good stuff for sure, and I wrote my working Node code only after writing it in Python first.

When I specified a field called unknown_stuff, it actually gave me a good error message: 400 Data source definition doesn't define this parameter Id: unknown_stuff.

from google.cloud import bigquery_datatransfer
from google.protobuf import struct_pb2
from google.oauth2 import service_account

credentials = service_account.Credentials.from_service_account_file(
    "your-credentials.json"
)

client = bigquery_datatransfer.DataTransferServiceClient(credentials=credentials)

parent = f"projects/your-google-project-name"

params = struct_pb2.Struct()
params.update({
    "destination_table_name_template": "destination-table-name",
    "data_path": "s3://path/to/files/*",
    "access_key_id": 'access-key',
    "secret_access_key": 'secret-key',
    "file_format": "JSON",
    "ignore_unknown_values": True,
    "write_disposition": "WRITE_APPEND",
    "unknown_stuff": "What is this"   ### <--- This is not valid and got the error nice
})

transfer_config = bigquery_datatransfer.TransferConfig(
    destination_dataset_id="your-dataset-id",
    display_name=f"Data transfer for XYZ stuff",
    data_source_id="amazon_s3",
    params=params,
    schedule="every 3 hours"
)

try:
    response = client.create_transfer_config(
        request={
            "parent": parent,
            "transfer_config": transfer_config
        }
    )
    print(f"Created transfer config: {response.name}")
except Exception as e:
    print(f"Error creating transfer config: {str(e)}")

In the case of Node I didn't get this kind of friendly error message, even though I should have gotten an error saying that object_path is not valid. Eventually I wrote working code in Node, but all the stuff I had to derive came from Python, so there is a plus point there.
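One trick that would have saved me the guessing (a sketch; I am assuming the getDataSource RPC from the DataTransferService proto is exposed under the same name in the Node client) is asking the API itself which parameter IDs the amazon_s3 data source accepts, which should show data_path rather than object_path:

import * as bigQueryDataTransfer from '@google-cloud/bigquery-data-transfer';

const client = new bigQueryDataTransfer.DataTransferServiceClient();

// Resource name format per the DataTransferService proto; 'project-name' is a placeholder.
const [dataSource] = await client.getDataSource({
  name: 'projects/project-name/dataSources/amazon_s3',
});

// Each DataSourceParameter carries paramId, description, required, type, etc.
for (const param of dataSource.parameters ?? []) {
  console.log(param.paramId, param.required ? '(required)' : '', param.description);
}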

One more thing I liked is that the documentation reference for the Python client is good. For example, compare the two links below; I really liked the Python docs for sure.

Node.JS: https://cloud.google.com/nodejs/docs/reference/bigquery-data-transfer/latest/bigquery-data-transfer/protos.google.cloud.bigquery.datatransfer.v1.itransferconfig

Python: https://cloud.google.com/python/docs/reference/bigquerydatatransfer/latest/google.cloud.bigquery_datatransfer_v1.types.TransferConfig

For those who are wondering about the code I wrote in Node afterward, check this:

import * as bigQueryDataTransfer from '@google-cloud/bigquery-data-transfer';

// Point Application Default Credentials at the service account key file.
process.env.GOOGLE_APPLICATION_CREDENTIALS = './google-service-file.json';

const client = new bigQueryDataTransfer.DataTransferServiceClient();

const transferConfig = {
  destinationDatasetId: 'dataset-id',
  displayName: 'Data transfer for Table Name',
  dataSourceId: 'amazon_s3',
  params: {
    destination_table_name_template: 'table-name',
    data_path: 's3://path/to/files/*',
    access_key_id: 'access-key',
    secret_access_key: 'secret-key',
    file_format: 'JSON',
    ignore_unknown_values: true,
    write_disposition: 'WRITE_APPEND',
  },
  schedule: 'every 3 hours',
};

const parent = 'projects/project-name';

const [response] = await client.createTransferConfig({
  parent: parent,
  transferConfig: {
    displayName: transferConfig.displayName,
    dataSourceId: transferConfig.dataSourceId,
    destinationDatasetId: transferConfig.destinationDatasetId,
    schedule: transferConfig.schedule,
    params: {
      // params is a google.protobuf.Struct, so each value is wrapped
      // in its typed form (stringValue / boolValue).
      fields: {
        destination_table_name_template: {
          stringValue: transferConfig.params.destination_table_name_template,
        },
        data_path: {
          stringValue: transferConfig.params.data_path,
        },
        access_key_id: {
          stringValue: transferConfig.params.access_key_id,
        },
        secret_access_key: {
          stringValue: transferConfig.params.secret_access_key,
        },
        file_format: {
          stringValue: transferConfig.params.file_format,
        },
        ignore_unknown_values: {
          boolValue: transferConfig.params.ignore_unknown_values,
        },
        write_disposition: {
          stringValue: transferConfig.params.write_disposition,
        },
      },
    },
  },
});

Upvotes: 0
