KLA

Reputation: 31

Integrate a Bigquery procedure into DBT

I am building a pipeline in DBT. I have a stored procedure in BigQuery that I used to run directly in BigQuery in an exploratory way.

In BigQuery the procedure is called with: CALL project_id.my_procedure(table_name, column_name). The procedure runs two dynamic queries (built from the table and column names given as input): the first creates a temporary table, and the second runs SELECT * FROM the table created in step 1.
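For context, such a procedure typically looks something like the following. This is only a hypothetical sketch of the pattern described above (the dataset, aggregation logic, and temp-table name are all placeholders, not the actual procedure):

```sql
-- Hypothetical sketch of a BigQuery procedure running two dynamic queries.
CREATE OR REPLACE PROCEDURE project_id.my_procedure(table_name STRING, column_name STRING)
BEGIN
  -- Step 1: dynamic query that materializes an intermediate temp table.
  EXECUTE IMMEDIATE FORMAT("""
    CREATE OR REPLACE TEMP TABLE step1 AS
    SELECT %s, COUNT(*) AS n
    FROM `project_id.dataset.%s`
    GROUP BY %s
  """, column_name, table_name, column_name);

  -- Step 2: read the result of step 1.
  EXECUTE IMMEDIATE "SELECT * FROM step1";
END;
```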

I would like to run this procedure in DBT by giving it the name of a view/table in DBT, and then use the output (the temporary table) downstream in the flow.

Is this possible? I read about using macros, but I couldn't figure out how to capture the output.

Thanks in advance for your help.

Upvotes: 0

Views: 1488

Answers (2)

KLA

Reputation: 31

We chose to recode the content of the procedure in DBT. I wrote a macro that other analysts can call in their pipelines.

I think it is the best option, as we still keep the lineage.
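As a rough illustration of this approach, the procedure's dynamic SQL can be moved into a macro that models call directly. The macro name, columns, and model names below are hypothetical, and using ref() inside the model is what preserves the lineage:

```sql
-- macros/aggregate_by.sql (hypothetical macro mirroring the procedure's logic)
{% macro aggregate_by(relation, column_name) %}
    select {{ column_name }}, count(*) as n
    from {{ relation }}
    group by {{ column_name }}
{% endmacro %}

-- models/my_model.sql (the model an analyst would write)
{{ aggregate_by(ref('source_model'), 'customer_id') }}
```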

Upvotes: 0

Rameez Khan

Reputation: 1

You could use dbt vars (https://docs.getdbt.com/reference/dbt-jinja-functions/var#variable-default-values) in the models you want to create.

So model_a will perform the search operation, output a table, and insert into the log table via a dbt post-hook (https://docs.getdbt.com/reference/resource-configs/pre-hook-post-hook).

Then you can run model_b, which references model_a and will therefore execute after it. The dbt command will look something like:

dbt run --select model_a model_b --vars='{table_name: table_a, column_name: column_a}'
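Under that setup, model_a might be written as follows. This is only a sketch under assumed names (the source, log table, and selected columns are hypothetical; the vars come from the command line above):

```sql
-- models/model_a.sql (hypothetical)
{{ config(
    post_hook="insert into {{ target.schema }}.log_table
               select '{{ var('table_name') }}', current_timestamp()"
) }}

select {{ var('column_name') }}
from {{ source('my_dataset', var('table_name')) }}
```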

Upvotes: 0
