Reputation: 19
I have CSV files in an S3 bucket, which I load into Snowflake through an external stage and then upsert with a stored procedure that runs a MERGE statement (rows only need to be inserted or updated).
However, my S3 access key/secret rotates every 3 months due to company policy.
Because of this, I have to manually recreate my external stage every time. Is there another way to load the data that doesn't require updating the key/secret each time?
Please help.
Upvotes: 1
Views: 643
Reputation: 175706
An alternative to explicitly providing a key/secret is to create a STORAGE INTEGRATION:
Creates a new storage integration in the account or replaces an existing integration.
A storage integration is a Snowflake object that stores a generated identity and access management (IAM) entity for your external cloud storage, along with an optional set of allowed or blocked storage locations (Amazon S3, Google Cloud Storage, or Microsoft Azure). Cloud provider administrators in your organization grant permissions on the storage locations to the generated entity. This option allows users to avoid supplying credentials when creating stages or when loading or unloading data.
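For example, a minimal sketch of creating such an integration (the integration name, IAM role ARN, and bucket path below are placeholders you would replace with your own values):

```sql
-- Create a storage integration backed by an AWS IAM role,
-- so stages that use it never need an access key/secret.
CREATE STORAGE INTEGRATION s3_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::001234567890:role/my_snowflake_role'
  STORAGE_ALLOWED_LOCATIONS = ('s3://mybucket/path/');
```

After creating it, run `DESC INTEGRATION s3_int;` and give the returned Snowflake IAM user access to the role in your AWS trust policy.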
Configuring a Snowflake Storage Integration to Access Amazon S3
This section describes how to use storage integrations to allow Snowflake to read data from and write data to an Amazon S3 bucket referenced in an external (i.e. S3) stage. Integrations are named, first-class Snowflake objects that avoid the need for passing explicit cloud provider credentials such as secret keys or access tokens. Integration objects store an AWS identity and access management (IAM) user ID. An administrator in your organization grants the integration IAM user permissions in the AWS account.
An integration can also list buckets (and optional paths) that limit the locations users can specify when creating external stages that use the integration.
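Once the integration exists and the AWS side trusts it, the stage references the integration instead of credentials, so key rotation no longer affects it. A sketch, assuming the placeholder integration and bucket names from above:

```sql
-- The stage uses the integration's IAM role; no key/secret is stored,
-- so nothing needs to change when company credentials rotate.
CREATE STAGE my_s3_stage
  STORAGE_INTEGRATION = s3_int
  URL = 's3://mybucket/path/'
  FILE_FORMAT = (TYPE = CSV);
```

Your existing COPY INTO / MERGE stored procedure can then read from `@my_s3_stage` unchanged.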
Upvotes: 2