Reputation: 701
I have a (GCP) Cloud Function that is meant to aggregate hourly data and write it to Cloud Bigtable. However, it seems to return with the message "Function execution took 100ms, finished with status: ok" before completing the full code; subsequent lines sometimes get run, sometimes not. Would be great if anyone has experience with this and can advise, thanks!
It works on my local machine when I run the script; it only fails in Cloud Functions, and I'm not sure what is triggering the termination of the code. I have tried adding a try/catch block but it did not throw any errors either. The main parts of the code are reproduced below:
const Bigtable = require('@google-cloud/bigtable');
const bigtableOptions = { projectId: process.env.PROJECT_ID };
const bigtable = new Bigtable(bigtableOptions);
const cbt = bigtable.instance(process.env.BIGTABLE_INSTANCE);
const async = require("async");
const moment = require("moment");
require("moment-round");
const bigtableFetchRawDataForDmac = require("./fetchData").bigtableFetchRawDataForDmac;

exports.patchJob = (event, context) => {
  const pubsubMsg = Buffer.from(event.data, 'base64').toString();
  const jsonMsg = tryParseJSON(pubsubMsg); // msg in format { time: "2018-12-24T02:00:00.000Z", dmac: ["abc", "def", "ghi"] }
  if (!jsonMsg) return;
  else {
    if (!jsonMsg.time) {
      console.log("Time not provided");
      // res.status(400).json({ err: 'TimeNotProvided', msg: `Time parameter is not provided` });
      return;
    }
    let date_range = {};
    date_range.lower = moment(jsonMsg.time).toISOString();
    date_range.upper = moment(jsonMsg.time).add(1, 'hours').subtract(1, "milliseconds").toISOString();
    let queryData = [];
    let data = {};
    for (let i = 0; i < jsonMsg.dmac.length; i++) {
      data[jsonMsg.dmac[i]] = [];
      queryData.push(bigtableFetchRawDataForDmac(cbt, jsonMsg.dmac[i], date_range.lower, date_range.upper, data[jsonMsg.dmac[i]]));
    }
    async.parallel(queryData, function(err, result) {
      console.log("cookie trail...");
      return;
    });
  }
};
The bigtableFetchRawDataForDmac function is in a different file:
function bigtableFetchRawDataForDmac(cbt, dmac, start, end, data) {
  return async function(cb) {
    const table = cbt.table(process.env.BT_DATA_TABLE);
    try {
      var bigtable = await fetchFromBigtable(table, process.env.BT_DATA_TABLE, dmac, start, end, data, ['push', 'mode', 'val']);
    } catch (err) {
      console.log("bigtableFetchRawDataForDmac failed: ", err);
      cb(err);
    }
  };
}
Upvotes: 1
Views: 2287
Reputation: 701
In the Node.js 8 (beta) runtime, 3 parameters should be provided (data, context, callback) instead of the 2 provided in the default template in the inline editor of the Cloud Functions console. (Documentation reference: https://cloud.google.com/functions/docs/writing/background#functions_background_parameters-node8)
The code should be something like:
exports.patchJob = (event, context, callback) => {
  doSomething();
  callback(); // To terminate Cloud Functions
};
Thanks @Doug for the hint!
Upvotes: 1
Reputation: 317322
A pubsub Cloud Function receives an event and a callback parameter. You're supposed to call the callback method to terminate the function when all the work is complete, as is the case for all types of background Cloud Functions.
You've called the callback parameter context, and you're not using it to terminate the function at all. You may alternatively return a promise that resolves when all the work is complete, but you're not doing that either.
You're going to have to figure out a way to properly terminate your function only after all the async work is complete, or it won't work the way you expect.
Upvotes: 5