TomLenzmeier

Reputation: 49

The S3 bucket addressed by the query is in a different region from this cluster

I have just begun my travels on the Redshift road. I am trying to load multiple GZIP files using the COPY command with a manifest. The S3 bucket and the cluster are both in the same region. My manifest file is a .txt file containing JSON. Either my syntax is wrong or something else is off. Thanks!

{
  "entries": [
    {"url": "s3://abacosdw-load-queue/sales1.txt.gz", "mandatory": true},
    {"url": "s3://abacosdw-load-queue/sales2.txt.gz", "mandatory": true},
    {"url": "s3://abacosdw-load-queue/sales3.txt.gz", "mandatory": true},
    {"url": "s3://abacosdw-load-queue/sales4.txt.gz", "mandatory": true}
  ]
}

copy factsales
from 's3://mydw-load-queue/manifest.txt'
iam_role 'arn:aws:iam::123456789:role/RedshiftS3'
region 'us-west-2'
delimiter '|'
GZIP
manifest;

Upvotes: 2

Views: 6931

Answers (3)

Karthikeyan VK

Reputation: 6006

You can try the following command:

copy tablename from 's3://your-bucket/output.csv' iam_role 'arn:aws:xxxxx' REGION 'regionName';

Make sure the region name is in quotes.
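
For the manifest-based load in the question, the same REGION clause combines with the MANIFEST and GZIP options. A sketch, where the bucket name, role ARN, and region below are placeholders rather than values to copy verbatim:

copy factsales
from 's3://your-bucket/manifest.txt'
iam_role 'arn:aws:iam::123456789012:role/your-redshift-role'
region 'us-east-1'
delimiter '|'
gzip
manifest;

REGION names the region the S3 bucket is in; it only needs to be stated when that region differs from the cluster's own.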

Upvotes: 0

VISHNU VARDHAN

Reputation: 1

I think the bucket you mention in the manifest file and the bucket you reference in the COPY command are in different regions. Please create them in the same region.
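
Worth checking against the question itself: the COPY command reads the manifest from s3://mydw-load-queue/, while the manifest entries point at s3://abacosdw-load-queue/. Those are two different buckets, and each one can live in a different region. A sketch of a manifest whose entries sit in the same bucket as the manifest file, assuming (hypothetically) that mydw-load-queue is the intended bucket:

{
  "entries": [
    {"url": "s3://mydw-load-queue/sales1.txt.gz", "mandatory": true},
    {"url": "s3://mydw-load-queue/sales2.txt.gz", "mandatory": true}
  ]
}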

Upvotes: 0

BigData-Guru

Reputation: 1261

For each COPY command, do the following:

Replace the bucket name in the command with the name of a bucket in the same region as your cluster.

This step assumes the bucket and the cluster are in the same region. Alternatively, you can specify the region using the REGION option with the COPY command.

Refer to the Amazon Redshift COPY documentation for details.
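
Conversely, if the bucket and the cluster really are in the same region, the REGION clause can simply be omitted, since COPY then assumes the bucket is in the cluster's region. A sketch of the question's command without it (the bucket and role values are taken from the question; whether they are the right ones for the asker's account is an assumption):

copy factsales
from 's3://mydw-load-queue/manifest.txt'
iam_role 'arn:aws:iam::123456789:role/RedshiftS3'
delimiter '|'
gzip
manifest;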

Upvotes: 3
