Reputation: 147
COPY INTO mycsvtable
FROM @my_csv_stage/tutorials/dataloading/
PATTERN='.*test.dat'
on_error = 'skip_file';
COPY INTO mycsvtable
FROM @my_csv_stage/tutorials/dataloading/
PATTERN='.*test.*[.]dat'
on_error = 'skip_file';
I have multiple files named test* in an AWS S3 bucket. I'm trying to copy those files into a Snowflake table, but when I run the COPY command it fails with the error: Copy executed with 0 files processed.
I tried the scripts above. It looks like I need to change the pattern. Any suggestions, please?
Upvotes: 1
Views: 1636
Reputation: 10039
If you want to ingest all files in the stage, you don't need the PATTERN clause - you can simply remove it.
If you do need to select files by pattern, list the stage first and make sure the files are actually there:
ls @my_csv_stage/tutorials/dataloading/
Lastly, these files may already have been ingested - Snowflake's load metadata skips files it has loaded before - so you can also try the FORCE = TRUE option:
COPY INTO mycsvtable
FROM @my_csv_stage/tutorials/dataloading/
on_error = 'skip_file' force=true;
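One more thing worth knowing: PATTERN is a regular expression that must match the *entire* path of each staged file, not just the file name. You can sanity-check a pattern locally before running COPY; here is a minimal Python sketch using the second pattern from the question against some hypothetical staged paths (the paths are illustrative, not from your bucket):

```python
import re

# Hypothetical staged file paths, as LIST @my_csv_stage/... might show them.
paths = [
    "tutorials/dataloading/test1.dat",
    "tutorials/dataloading/test_2022.dat",
    "tutorials/dataloading/other.csv",
]

# The second pattern from the question. Snowflake applies the regex to the
# whole path, so use fullmatch rather than search to mimic that behavior.
pattern = r".*test.*[.]dat"

matches = [p for p in paths if re.fullmatch(pattern, p)]
print(matches)
```

Note that the first pattern in the question, `.*test.dat`, allows only a single character between `test` and `dat` (the unescaped `.` matches exactly one character), so it would not match a file like `test1.dat`, whose name has `1.` in that position.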
Upvotes: 1