Reputation: 34
I am curious to understand the process Snowflake uses to store data in micro-partitions. As far as I know, each micro-partition holds roughly 50-500 MB of uncompressed data.
Suppose I have a file 1 GB in size and I want to load it into Snowflake. Can someone explain the internal process/steps Snowflake follows to store the data in micro-partitions?
Upvotes: 0
Views: 587
Reputation: 426
To optimize the number of parallel operations for a load, Snowflake recommends files roughly 10 MB to 100 MB in size, compressed. Splitting large files into a greater number of smaller files distributes the load among the servers in an active warehouse and increases performance.
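For the 1 GB file in the question, that guidance means splitting it into roughly 10-100 MB pieces before staging. A minimal Python sketch of a line-aligned split for a CSV file (file names, the header handling, and the 100 MB target are illustrative assumptions, not a Snowflake API):

```python
# Split a large CSV into ~100 MB, line-aligned chunks before staging,
# so Snowflake can load them in parallel. Paths/names are hypothetical.

CHUNK_BYTES = 100 * 1024 * 1024  # target chunk size (~100 MB)

def split_csv(path, chunk_bytes=CHUNK_BYTES, out_prefix="chunk"):
    """Split `path` into line-aligned pieces of roughly `chunk_bytes`
    each, repeating the header row in every piece. Returns chunk names."""
    chunks = []
    with open(path, "rb") as src:
        header = src.readline()       # keep the header in each chunk
        idx = 0
        line = src.readline()
        while line:
            name = f"{out_prefix}_{idx:04d}.csv"
            with open(name, "wb") as dst:
                dst.write(header)
                written = 0
                # stop at a line boundary once the size target is reached
                while line and written < chunk_bytes:
                    dst.write(line)
                    written += len(line)
                    line = src.readline()
            chunks.append(name)
            idx += 1
    return chunks
```

The resulting files can then be uploaded with `PUT` and loaded with a single `COPY INTO`, letting the warehouse process the chunks in parallel.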
Upvotes: 0
Reputation: 6229
Snowflake's micro-partition file format is proprietary, so you're not going to get much more information than is already in the documentation (short of someone breaching their Snowflake employment contract).
Upvotes: 4