Soumitri Pattnaik

Reputation: 3556

Google Cloud PubSub message republishing across GCP projects

Context

I am working on a project where we receive realtime data on a Pub/Sub topic in a particular GCP project, STAGE-1. We have other GCP projects that we treat as lower-level environments, such as DEV-1, QA-1, etc., to which we want these messages re-published, since the realtime data only hydrates the topic under the STAGE-1 project.

Question

P.S. I am very new to PubSub.

Thanks in advance. Cheers :)

Upvotes: 0

Views: 854

Answers (1)

al-dann

Reputation: 2725

There is at least one possible workaround.

You will need to create an additional subscription (or subscriptions) on the original topic in the source project. That subscription is then consumed by some 'active' component (running in any project, subject to IAM permissions to access the given subscription).
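For the cross-project piece, the setup could look like the following gcloud sketch. It assumes the source topic lives in STAGE-1 and the consumer runs as a service account in DEV-1; the topic name (`realtime-data`), subscription name (`bridge-sub`), and service-account name (`bridge-runner`) are all illustrative:

```shell
# In the source project: create the extra subscription on the original topic.
gcloud pubsub subscriptions create bridge-sub \
  --topic=realtime-data \
  --project=STAGE-1

# Grant the consumer's service account (from another project) permission
# to pull from that subscription.
gcloud pubsub subscriptions add-iam-policy-binding bridge-sub \
  --project=STAGE-1 \
  --member="serviceAccount:bridge-runner@DEV-1.iam.gserviceaccount.com" \
  --role="roles/pubsub.subscriber"
```

Note that the IAM grant is on the subscription itself, so the consumer in DEV-1 needs no broader access to the STAGE-1 project.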

The 'active' component can be a Cloud Function, a Cloud Run service, a Dataflow job, an App Engine app, or something running on a Compute Engine VM or a k8s cluster...

From my point of view, one of the simplest solutions (though maybe not the cheapest, depending on your context) is a streaming Dataflow job that reads from a source subscription and pushes messages into one or many target topics, acting as a kind of 'bridge'.
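The core of that bridge can be sketched independently of any GCP client library. Below is a minimal, provider-agnostic sketch in plain Python, with in-memory stand-ins for the source subscription and the target topics; in a real deployment these would be Pub/Sub reads and writes (e.g. Beam's `ReadFromPubSub`/`WriteToPubSub` in a Dataflow job):

```python
from collections import deque
from dataclasses import dataclass, field


@dataclass
class Message:
    """A Pub/Sub-style message: a payload plus string attributes."""
    data: bytes
    attributes: dict = field(default_factory=dict)


class InMemoryTopic:
    """Stand-in for a target Pub/Sub topic; records what was published."""
    def __init__(self, name: str):
        self.name = name
        self.published = []

    def publish(self, message: Message) -> None:
        self.published.append(message)


def bridge(source: deque, targets: list) -> None:
    """Drain messages from the source subscription (here a deque) and
    republish each one, unchanged, to every target topic."""
    while source:
        msg = source.popleft()
        for topic in targets:
            topic.publish(msg)


# Usage: one source subscription fanned out to two lower environments.
source_sub = deque([Message(b"event-1", {"env": "STAGE-1"}),
                    Message(b"event-2", {"env": "STAGE-1"})])
dev_topic = InMemoryTopic("projects/DEV-1/topics/realtime-data")
qa_topic = InMemoryTopic("projects/QA-1/topics/realtime-data")
bridge(source_sub, [dev_topic, qa_topic])
print(len(dev_topic.published))  # → 2
```

The loop body is also the natural place for the "additional message handling logic" mentioned below: filter, enrich, or route messages before republishing.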

If the flow of messages (the number of messages per unit of time) is significant, or you need to serve many (dozens or hundreds of) source subscriptions, it can be a quite cost-effective solution (from my point of view).

A potential side bonus, if you develop a bespoke template for the Dataflow job: you can implement additional message-handling logic inside the job.

If you need something 'yesterday', no additional transformation is required, and there is only one source subscription and one target topic, then there is a Google-provided template, Pub/Sub to Pub/Sub, which can be used 'immediately'.
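Launching that Google-provided template might look like the following, run in the target project; the job name, region, and the subscription/topic names are illustrative placeholders:

```shell
# Run the Google-provided "Pub/Sub to Pub/Sub" streaming template:
# it pulls from one subscription and republishes to one topic.
gcloud dataflow jobs run pubsub-bridge \
  --project=DEV-1 \
  --region=us-central1 \
  --gcs-location=gs://dataflow-templates-us-central1/latest/Cloud_PubSub_to_Cloud_PubSub \
  --parameters=inputSubscription=projects/STAGE-1/subscriptions/bridge-sub,outputTopic=projects/DEV-1/topics/realtime-data
```

One template instance serves a single subscription-to-topic pair, so fanning out to several lower environments means running one job per target (or switching to a bespoke template).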

Upvotes: 3
