Reputation: 31
My pipe executes a COPY INTO command every time a Parquet file lands in a stage location in AWS S3, and the execution itself works just fine.
This is my copy query: (summarized)
copy into table_name
from (
    select
        TRY_TO_DATE(
            $1:int_field::varchar,
            'YYYYMMDD'
        ) as date_field
    from @"stage-location"/path/path2/ (FILE_FORMAT => 'c000')
) ON_ERROR = 'SKIP_FILE_1%' PATTERN = '.*part.*';
So I convert $1:int_field (type: int) to VARCHAR (::varchar) and then parse that varchar as a DATE in 'YYYYMMDD' format. That works fine for values of int_field that conform to the format, but when the field is 0 the load fails (only when it is executed by the pipe).
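For reference, this is the behavior I expect from TRY_TO_DATE when I run it interactively in a worksheet (sample values for illustration only):
-- Run interactively, TRY_TO_DATE returns NULL for unparseable input
-- instead of raising an error:
select
    TRY_TO_DATE('20230115', 'YYYYMMDD') as valid_value, -- 2023-01-15
    TRY_TO_DATE('0', 'YYYYMMDD')        as zero_value;  -- NULL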
When the pipe executed the COPY command by itself, I checked COPY_HISTORY and got the following error:
Can't parse '0' as date with format 'YYYYMMDD'
And of course the load fails: FAILED LOAD.
Here is where it gets interesting: when I execute this SAME copy command myself in a Worksheet, the load goes smoothly: OK LOAD.
I tried:
- VALIDATE, VALIDATION_MODE, and VALIDATE_PIPE_LOAD, but these do not support COPY INTO statements that transform data during a load, like mine.
- FILE_FORMAT = (FORMAT_NAME = c000, DATE_FORMAT = 'YYYYMMDD') with ON_ERROR = 'SKIP_FILE_1%' >>> SAME ISSUE: the file is only loaded when I execute the COPY command by hand.
Maybe it is some session problem. I read something about DATE_INPUT_FORMAT (see the sketch below for how to inspect it), but I can't pin down the exact cause.
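In case it helps others reproduce this, these are the standard Snowflake commands for inspecting and setting that parameter:
-- Inspect the current value of the parameter:
show parameters like 'DATE_INPUT_FORMAT';

-- Setting it affects only the current session; a pipe runs in its own
-- context, so this would not change Snowpipe behavior:
alter session set DATE_INPUT_FORMAT = 'YYYYMMDD';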
Can someone help me? Thanks!
Upvotes: 1
Views: 1389
Reputation: 10059
In my tests, it fails all the time (even the stand-alone COPY does not work). On the other hand, querying the staged file directly works as expected.
select TRY_TO_DATE(
    $1::varchar,
    'YYYYMMDD'
) as date_field
from @my_stage; -- works
copy into testing
from (
    select
        TRY_TO_DATE(
            $1::varchar,
            'YYYYMMDD'
        )
    from @my_stage
) ON_ERROR = 'SKIP_FILE_1%'; -- fails with "Date '0' is not recognized"
It seems there is an issue with TRY_TO_DATE when it runs as part of a COPY transformation. By the way, I tested TRY_TO_NUMBER the same way, and it works.
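For comparison, this is roughly the shape of the TRY_TO_NUMBER test (the target table name here is hypothetical):
-- Same transformation shape, but with TRY_TO_NUMBER; this one loads
-- without error in my tests:
copy into testing_numbers
from (
    select TRY_TO_NUMBER($1::varchar)
    from @my_stage
) ON_ERROR = 'SKIP_FILE_1%';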
You should submit a case to Snowflake support so the development team can investigate the issue.
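In the meantime, one possible workaround is to keep the problematic value away from TRY_TO_DATE altogether, for example with NULLIF (a sketch, assuming 0 is a placeholder value in your data):
-- NULLIF maps the literal '0' to NULL before TRY_TO_DATE sees it,
-- so the transformation never has to parse an invalid date:
copy into testing
from (
    select
        TRY_TO_DATE(
            NULLIF($1::varchar, '0'),
            'YYYYMMDD'
        )
    from @my_stage
) ON_ERROR = 'SKIP_FILE_1%';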
Upvotes: 1