Reputation: 532
(Posting this Q&A sequence in SO as I'm sure some newer users may be running into similar "speed bumps". -G)
How to handle special characters in Snowpipe
I am creating a Snowpipe based on a CSV file. My CSV file contains special characters in a few columns. Please let me know how to write a SELECT statement in the Snowpipe definition that will take care of any special characters.
Upvotes: 3
Views: 10487
Reputation: 990
To handle special characters, you need to escape them.
There are two ways to escape special characters, but unfortunately each of them requires you to modify the file:
1) You can escape a special character by duplicating it (so to escape a ' you write it as '').
2) When defining your file format, you can add the ESCAPE parameter to define an explicit escape character. For example, you could use ESCAPE = '\\' and then add a single \ character before each of the special characters you want to escape (see the sketch below).
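As an illustration of option 2, here is a minimal sketch of such a file format definition. The format name is hypothetical; the point is that ESCAPE = '\\' makes the backslash the escape character, so a quote or delimiter inside a field can be written as \" or \, in the file.

CREATE OR REPLACE FILE FORMAT my_csv_format      -- hypothetical name
  TYPE = CSV
  FIELD_DELIMITER = ','
  FIELD_OPTIONALLY_ENCLOSED_BY = '"'
  ESCAPE = '\\';                                 -- a single backslash escapes the next character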
Upvotes: 3
Reputation: 532
The Snowpipe (CREATE PIPE) command embeds a COPY statement, and that COPY statement can contain a SELECT statement (a load transformation), so we can use string functions to remove the special characters.
/* Data load with transformation */
COPY INTO [<namespace>.]<table_name> [ ( <col_name> [ , <col_name> ... ] ) ]
FROM ( SELECT [<alias>.]$<file_col_num>[.<element>] [ , [<alias>.]$<file_col_num>[.<element>] ... ]
FROM { internalStage | externalStage } )
[ FILES = ( '<file_name>' [ , '<file_name>' ] [ , ... ] ) ]
[ PATTERN = '<regex_pattern>' ]
[ FILE_FORMAT = ( { FORMAT_NAME = '[<namespace>.]<file_format_name>' |
TYPE = { CSV | JSON | AVRO | ORC | PARQUET | XML } [ formatTypeOptions ] } ) ]
The syntax above is taken from the following documentation page: https://docs.snowflake.net/manuals/sql-reference/sql/copy-into-table.html. A concrete example follows.
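For example, a pipe definition along these lines could strip unwanted characters during the load. This is only a sketch under assumed names (the pipe, table, stage, and file format names are all hypothetical), and the regular expression simply keeps printable ASCII characters; adjust it to whatever special characters you actually need to remove.

CREATE OR REPLACE PIPE my_db.my_schema.my_pipe   -- hypothetical pipe name
  AUTO_INGEST = TRUE
AS
COPY INTO my_db.my_schema.my_table (id, clean_text)
FROM (
  SELECT
    $1,
    REGEXP_REPLACE($2, '[^ -~]', '')             -- drop anything outside printable ASCII
  FROM @my_db.my_schema.my_stage
)
FILE_FORMAT = (FORMAT_NAME = 'my_db.my_schema.my_csv_format');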
NOTE: I'd be interested to see if anyone else has utilized an alternative solution with success... -G
Upvotes: 0