diptiranjan pradhan

Reputation: 81

Ran out of memory during Redshift UNLOAD operation

This is the command I'm using:

unload ('select * from SPEC_BFO.CASE_HISTORY
         where (INTEGRATION_ID, LAST_OPERATION_DATE) IN (
             select INTEGRATION_ID, max(LAST_OPERATION_DATE)
             from SPEC_BFO.CASE_HISTORY
             group by INTEGRATION_ID)')
to 's3://use-s3-dwnam-qa/NAM/SPEC_BFO/CASE_HISTORY/VIEW_CASE_HISTORY/VIEW_CASE_HISTORY.'
iam_role 'arn:aws:iam::111111111111:role/use-redshift-dwnam-qa'
delimiter '|' PARALLEL OFF header ALLOWOVERWRITE gzip;

Getting this error:

There is a problem: InternalError: S3 Query Exception (Fetch)
DETAIL:
  -----------------------------------------------
  error:     S3 Query Exception (Fetch)
  code:      15001
  context:   Request ran out of memory in the S3 query layer.
  query:     8163346
  location:  dory_util.cpp:1083
  process:   asyncrequest_thread [pid=112556]
  -----------------------------------------------

Remote Script invocation failed, this script will halt.

Upvotes: 0

Views: 432

Answers (1)

John Rotenstein

Reputation: 269400

It is likely that the query is too complex to combine with an UNLOAD operation.

Try creating the output table first with a CREATE TABLE AS command, then UNLOAD that table with a plain SELECT *, as sketched below.
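A minimal sketch of that two-step approach, reusing the table, bucket, and role names from the question (the staging table name CASE_HISTORY_LATEST is illustrative):

-- Step 1: materialize the latest row per INTEGRATION_ID into a staging table,
-- so the expensive subquery runs in the normal query engine, not during UNLOAD.
create table SPEC_BFO.CASE_HISTORY_LATEST as
select *
from SPEC_BFO.CASE_HISTORY
where (INTEGRATION_ID, LAST_OPERATION_DATE) in (
    select INTEGRATION_ID, max(LAST_OPERATION_DATE)
    from SPEC_BFO.CASE_HISTORY
    group by INTEGRATION_ID
);

-- Step 2: UNLOAD the staging table with a trivial SELECT *.
unload ('select * from SPEC_BFO.CASE_HISTORY_LATEST')
to 's3://use-s3-dwnam-qa/NAM/SPEC_BFO/CASE_HISTORY/VIEW_CASE_HISTORY/VIEW_CASE_HISTORY.'
iam_role 'arn:aws:iam::111111111111:role/use-redshift-dwnam-qa'
delimiter '|' PARALLEL OFF header ALLOWOVERWRITE gzip;

-- Optional: drop the staging table once the unload succeeds.
drop table SPEC_BFO.CASE_HISTORY_LATEST;

This keeps all of the filtering work in the regular query layer and leaves UNLOAD with a simple scan to export.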

Upvotes: 1
