Reputation: 188
This article explains how you can execute a pipeline written in Java via a Cloud Function; however, I'm trying to accomplish this with a pipeline written in Python.
I'm able to do this successfully when executing the Cloud Function locally, using a virtualenv environment for Python, before everything is packaged up as a zip.
exports.foo = function(event, callback) {
  var spawn = require('child_process').spawn;
  // Spawn the virtualenv's Python interpreter; each flag and its value
  // must be a separate array element, since spawn() does not use a shell.
  var child = spawn(
    'ENV/bin/python',
    ["pipeline.py",
     "--project", "$PROJECT_ID",
     "--temp_location", "gs://$BUCKET/temp",
     "--staging_location", "gs://$BUCKET/staging",
     "--runner", "DataflowRunner"],
    {cwd: __dirname}
  );
  child.stdout.on('data', (data) => {
    console.log(`stdout: ${data}`);
  });
  child.stderr.on('data', (data) => {
    console.log(`stderr: ${data}`);
  });
  child.on('close', (code) => {
    console.log(`child process exited with code ${code}`);
    callback();
  });
};
However, when I actually deploy the Function to GCP and run it from there, the pipeline never executes.
Any insight on this would be appreciated.
Below is the log output from running the deployed Function:
D foo vxvt93uc415v 2017-03-05 00:56:43.639 Function execution started
D foo vxvt93uc415v 2017-03-05 00:56:57.945 Function execution took 14308 ms, finished with status: 'ok'
UPDATE:
There was an error that I wasn't logging correctly:
ENV/bin/python is not a supported ELF or interpreter script
I've reached out to the Cloud Functions team, who then filed a bug report.
Upvotes: 4
Views: 872
Reputation: 2114
I had the same problem with a binary that had been compiled for macOS. The Cloud Functions container runs Debian, and Linux and macOS executables are not compatible (see https://stackoverflow.com/a/9439548).
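A quick way to confirm the mismatch is the file command; the paths and output below are illustrative, not taken from your environment:

file ENV/bin/python
ENV/bin/python: Mach-O 64-bit executable x86_64          <- built on macOS
file /usr/bin/python
/usr/bin/python: ELF 64-bit LSB executable, x86-64, ...  <- what the Debian container expects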
I pulled the Cloud Functions Docker image, downloaded and compiled the binary that I needed inside a container, then copied it out of the container and deployed it with the Cloud Function, as sketched below. But you may be able to get away with just using a precompiled Debian-compatible Python interpreter in your case.
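Roughly, the workflow looked like this; IMAGE is a placeholder for whatever Debian-based image matches the Cloud Functions runtime, so treat this as a sketch rather than the exact commands:

# Start a shell inside a container based on the runtime's Debian image
# (IMAGE is a placeholder, not a real image name).
docker run -it --name builder IMAGE /bin/bash
# ...inside the container, download and compile the binary you need...
# Back on the host, copy the compiled binary out of the container:
docker cp builder:/path/to/binary ./binary
# Then deploy the binary alongside the rest of your function source.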
Upvotes: 1