Naveen Prabhu

Reputation: 21

Error: The template parameters are invalid — Google Cloud Function running the existing Dataflow template GCS_Text_to_BigQuery

I'm building a Cloud Function with a Storage trigger that launches a Dataflow job from within it. I created a Dataflow job "jsonbq-1" from the existing GCS_Text_to_BigQuery template and wrote a simple UDF to take the incoming CSV data and put it into tables. The Cloud Function executes and Dataflow is called, but there is no response, and this error is displayed in the log: Error: The template parameters are invalid.

I'm not sure where the error is.

Index.js looks like this:

```javascript
const google = require('googleapis');
//const { auth } = require('google-auth-library');

exports.goWithTheDataFlow = (event, callback) => {

  const file = event.data;
  const context = event.context;

  console.log(`Event ${context.eventId}`);
  console.log(`  Event Type: ${context.eventType}`);
  console.log(`  Bucket: ${file.bucket}`);
  console.log(`  File: ${file.name}`);
  console.log(`  Metageneration: ${file.metageneration}`);
  console.log(`  Created: ${file.timeCreated}`);
  console.log(`  Updated: ${file.updated}`);

  google.auth.getApplicationDefault(function (err, authClient, projectId) {
    if (err) {
      throw err;
    }
    console.log(projectId);

    const dataflow = google.dataflow({ version: 'v1b3', auth: authClient });
    console.log(`gs://${file.bucket}/${file.name}`);
    dataflow.projects.templates.create({
      projectId: projectId,
      resource: {
        parameters: {
          inputFile: `gs://${file.bucket}/${file.name}`,
          output_Table: 'titanium-gamma-212906:np_cf_1.cf1',
          //bigQueryLoadingTemporaryDirectory: 'gs://test-bucket-np'
        },
        jobName: 'json-bq1',
        gcsPath: 'gs://dataflow-templates/latest/GCS_Text_to_BigQuery',
        //javascriptTextTransformFunctionName: 'transform',
      }
    }, function (err, response) {
      if (err) {
        console.error("problem running dataflow template, error was: ", err);
      }
      console.log("Dataflow template response: ", response);
      callback();
    });

  });

  callback();
};
```

package.json:

```json
{
  "name": "sample-cloud-storage",
  "version": "0.0.1",
  "dependencies": {
    "googleapis": "24.0.0"
  }
}
```

The log: (screenshot showing "Error: The template parameters are invalid")

Thanks in advance folks.

Upvotes: 2

Views: 1652

Answers (1)

Sameer Abhyankar

Reputation: 261

The GCS_Text_to_BigQuery template has several required parameters that appear to be missing from your call:

  1. "The GCS location of the text you'd like to process" inputFilePattern

  2. "JSON file with BigQuery Schema description" JSONPath

  3. "Output topic to write to" outputTable

  4. "GCS path to javascript fn for transforming output" javascriptTextTransformGcsPath

  5. "UDF Javascript Function Name" javascriptTextTransformFunctionName

  6. "Temporary directory for BigQuery loading process" bigQueryLoadingTemporaryDirectory

Upvotes: 2
