Phi Nguyen

Reputation: 41

Google Cloud Pub/Sub Trigger on Google Images

We need a way to automatically create a Pub/Sub trigger on new compute images (preferably triggered on a specific image family). Alternatively, we know that Pub/Sub notifications can be set up on GCS buckets, but we have not found a way to automate transferring images to a GCS bucket.

For some background: we are automating image baking through Packer, and we need this piece to trigger a Terraform run. We know that a cron job could be created to simply poll for newly created images, but we are wondering if there is already support for such a trigger in GCP.

Upvotes: 1

Views: 919

Answers (2)

Saby

Reputation: 41

Another solution would be to create a Cloud Function with --trigger-topic={your pub sub topic} and then filter only the images that you want to act on, based on some environment variables set on the Cloud Function.

Pseudo code:

1. Create a Pub/Sub topic for images being inserted in GCR:

    gcloud pubsub topics create projects/<project_id>/topics/gcr
2. This will now publish messages for all images being inserted/modified/deleted in the repo.
3. Create a cloud function with the function signature below:
// contents of index.js
// use the Storage class from the google-cloud Node.js API to work on storage buckets
// https://www.npmjs.com/package/@google-cloud/storage

const { Storage } = require('@google-cloud/storage');

function moveToStorageBucket(pubSubEvents, context, callback) {
  /* this is how the pubsub message comes from GCR:
  {"data":{"@type":"... .v1.PubsubMessage", "attribute":null, "data": "<base 64 encoded>"},
   "context":{..other details}}
  the data field, once base64 decoded, is in this format:
  { "action":"INSERT","digest":"<image name>","tag":"<tag name>"}
  */
  const data = JSON.parse(Buffer.from(pubSubEvents.data, 'base64').toString());
  // get the image name from the environment variable passed at deploy time
  const IMAGE_NAME = process.env.IMAGE_NAME;
  if (data.digest.indexOf(IMAGE_NAME) !== -1) {
    // your action here...
  }
  callback();
}

module.exports.moveToStorageBucket = moveToStorageBucket;
4. Deploy the cloud function (a test publish to verify the wiring is sketched below):
gcloud functions deploy <function_name> --region <region> --runtime=nodejs8 --trigger-topic=<topic created> --entry-point=moveToStorageBucket --set-env-vars=^--^IMAGE_NAME=<your image name>
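
To sanity-check the wiring, you can publish a hand-crafted message in the same shape GCR sends and then read the function's logs. This is just a sketch using the topic created above and the same placeholder values; adjust for your project:

# publish a test message shaped like a GCR notification
gcloud pubsub topics publish projects/<project_id>/topics/gcr --message='{"action":"INSERT","digest":"<image name>","tag":"<tag name>"}'

# confirm the function fired
gcloud functions logs read <function_name> --region <region>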

Hope that helps

Upvotes: 1

Guillem Xercavins

Reputation: 7058

You can have a Stackdriver Logging export sink that publishes to Pub/Sub and is triggered by a specific filter (docs). For example:

resource.type="gce_image"
jsonPayload.event_subtype="compute.images.insert"
jsonPayload.event_type="GCE_OPERATION_DONE"
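
If it helps, here is a sketch of creating such a sink from the command line. The topic and sink names (image-triggers, image-insert-sink) are just examples, and the sink's writer identity (shown by the describe command) still needs the Pub/Sub Publisher role on the topic:

gcloud pubsub topics create image-triggers

gcloud logging sinks create image-insert-sink \
  pubsub.googleapis.com/projects/<project_id>/topics/image-triggers \
  --log-filter='resource.type="gce_image" AND jsonPayload.event_subtype="compute.images.insert" AND jsonPayload.event_type="GCE_OPERATION_DONE"'

# note the writerIdentity in the output and grant it roles/pubsub.publisher on the topic
gcloud logging sinks describe image-insert-sink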

To trigger it only for a specific family, you can use the other filter below. Note that protoPayload.request.family is only present when the API request is received, not when it is actually fulfilled (you may need to add some delay in your processing function).

resource.type="gce_image"
protoPayload.request."@type"="type.googleapis.com/compute.images.insert"
protoPayload.request.family="FAMILY-NAME"
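
As a rough sketch of the consumer side, a Cloud Function subscribed to the sink's topic could decode the exported LogEntry and check the requested family before kicking anything off. IMAGE_FAMILY here is a hypothetical environment variable, and the delay/polling mentioned above is left to you:

// index.js for a function with --trigger-topic pointing at the sink's topic
exports.onImageInsert = (pubSubEvent, context, callback) => {
  // the sink publishes the LogEntry as the base64-encoded message payload
  const logEntry = JSON.parse(Buffer.from(pubSubEvent.data, 'base64').toString());
  const request = (logEntry.protoPayload && logEntry.protoPayload.request) || {};
  if (request.family === process.env.IMAGE_FAMILY) {
    // the insert may not be fulfilled yet when this entry arrives,
    // so wait/poll for the image before triggering the Terraform run
    console.log('Image insert requested for family', request.family);
  }
  callback();
};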

Upvotes: 2
