Faisal Shani

Reputation: 810

Manually flush data from table streams in Snowflake

I am using a table stream in my ETL pipeline. I am wondering whether there is any way to flush the data from a table stream manually, without saving it to another table?

Upvotes: 5

Views: 4324

Answers (2)

Greg Pavlik

Reputation: 11046

Snowflake will advance the stream's offset whenever the stream appears after FROM in a DML statement. It doesn't care how, or whether, you actually use the rows, so you can do this:

insert into JUNK_TABLE select ANY_COLUMN from MY_STREAM where false;

If you run just this part of the insert, you can see that nothing will get inserted:

select ANY_COLUMN from MY_STREAM where false;

The WHERE clause evaluates to false for every row, so the INSERT statement won't insert a single row, but it will still consume the stream.

Here's a mini script to test this:

-- Quick setup:
create or replace table MY_TABLE(COL1 varchar);
create or replace stream MY_STREAM on table MY_TABLE;

--Create a junk table so the syntax works:
create table JUNK_TABLE like MY_TABLE;

insert into MY_TABLE(COL1) values ('Row1'), ('Row2'), ('Row3');

select * from MY_STREAM; --The stream has three rows

insert into JUNK_TABLE select COL1 from MY_STREAM where false;  --Consume the stream

select * from MY_STREAM; -- The stream has no rows

select * from JUNK_TABLE; -- Neither does the junk table because "where false" on the insert

Upvotes: 7

Francesco Quaratino

Reputation: 590

Have you considered recreating the stream with a CREATE OR REPLACE STREAM ... statement? Recreating the stream resets its offset, discarding any unconsumed change records. If you had to do that periodically, you could also wrap it in a stored procedure and run it from a scheduled Task.
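A minimal sketch of that approach, reusing the MY_STREAM/MY_TABLE names from the other answer (the procedure, task, and warehouse names here are hypothetical, and the CRON schedule is just an example):

```sql
-- Recreating the stream resets its offset, discarding unconsumed rows.
create or replace procedure FLUSH_MY_STREAM()
returns varchar
language sql
as
$$
begin
    create or replace stream MY_STREAM on table MY_TABLE;
    return 'Stream recreated';
end;
$$;

-- A task that flushes the stream once a day at midnight UTC:
create or replace task FLUSH_MY_STREAM_TASK
    warehouse = MY_WAREHOUSE
    schedule = 'USING CRON 0 0 * * * UTC'
as
    call FLUSH_MY_STREAM();

-- Tasks are created suspended, so start it explicitly:
alter task FLUSH_MY_STREAM_TASK resume;
```

Note that anything written to MY_TABLE between the last read and the recreate is lost to consumers, so this suits pipelines where periodically discarding the backlog is acceptable.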

Upvotes: 3
