Jeff

Reputation: 449

Snowflake - Putting large file into internal Snowflake Stage

I am currently trying to upload a large, unzipped CSV file into an internal Snowflake stage. The file is 500 GB. I ran the PUT command, but it doesn't look like much is happening; there is no status update, it's just kind of hanging there.

Any ideas what's going on here? Will this eventually time out? Will it complete? Anyone have an estimated time?

I am tempted to try and kill it somehow. In the meantime, I am splitting the large 500 GB file into about 1,000 smaller files that I'm going to zip up and upload in parallel (after reading more on best practices).
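For reference, this is roughly the split-and-upload plan I'm considering from a SnowSQL session (the stage and file names below are placeholders, not my real ones):

    -- files already split and gzipped locally before upload
    PUT file:///tmp/split/part_*.csv.gz @my_csv_stage
        AUTO_COMPRESS = FALSE   -- already compressed, so skip Snowflake's gzip step
        PARALLEL = 10;          -- number of upload threads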

Upvotes: 1

Views: 2629

Answers (2)

Sriga

Reputation: 1321

Per Snowflake's recommendation, split the file into multiple smaller files, then stage them in the Snowflake internal stage. (By default, Snowflake will compress the files.)

Then run the COPY command with a multi-cluster warehouse, and you will see Snowflake's performance.
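For example, something along these lines from SnowSQL (the stage, table, and warehouse names are just examples):

    -- stage the split files (PUT gzip-compresses each file by default)
    PUT file:///data/split/part_*.csv @my_int_stage;

    -- run the load on a (multi-cluster) warehouse
    USE WAREHOUSE load_wh;
    COPY INTO my_table
      FROM @my_int_stage
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);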

Upvotes: 1

PaulHoran

Reputation: 59

Unless you've specified AUTO_COMPRESS=FALSE, step 1 of the PUT is compressing the file, which may take some time on 500 GB...
Using PARALLEL=<n> will automatically split the file into smaller chunks and upload them in parallel - you don't have to split the source file yourself. (But you can if you want to...)
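Something like this (the local path and stage name are just examples):

    -- single PUT of the original file; compression happens client-side first
    PUT file:///data/large_file.csv @my_int_stage
        AUTO_COMPRESS = TRUE   -- the default; gzips the file before upload
        PARALLEL = 20;         -- 1-99 upload threads (default 4); large files are chunked automatically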

Upvotes: 2
