Ode

Reputation: 547

Streaming gzipped file to S3 via AWS SDK GO

I followed the example on the AWS site for gzipping files and streaming them to S3, found here: http://docs.aws.amazon.com/sdk-for-go/latest/v1/developerguide/common-examples.title.html

I am having an issue where the only things landing in my S3 bucket are files containing little more than the gzip header. Every single file is 23 bytes in size.

Any idea what would cause this?

My code:

func (t *Table) Upload() {
  year := time.Now().Format("2006")
  month := time.Now().Format("01")
  day := time.Now().Format("02")
  reader, writer := io.Pipe()
  go func() {
    gw := gzip.NewWriter(writer)
    io.Copy(gw, t.File)
    t.File.Close()
    gw.Close()
    writer.Close()
  }()
  uploader := s3manager.NewUploader(session.New(&aws.Config{Region: aws.String(os.Getenv("AWS_REGION"))}))
  result, err := uploader.Upload(&s3manager.UploadInput{
    Body:   reader,
    Bucket: aws.String(os.Getenv("S3_BUCKET")),
    Key:    aws.String(fmt.Sprintf("%s/%s/%s/%s/%s", os.Getenv("S3_KEY"), year, month, day, t.Name+".csv.gz")),
  })
  if err != nil {
    log.WithField("error", err).Fatal("Failed to upload file.")
  }
  log.WithField("location", result.Location).Info("Successfully uploaded to")
}

Upvotes: 2

Views: 2826

Answers (2)

hanjm

Reputation: 1

Do you call t.File.Write() elsewhere in your code? If so, the cursor of t.File may be sitting at the end of the file, so io.Copy reads nothing. You need to seek back to the beginning of the file first.

Call t.File.Seek(0, 0) (i.e. t.File.Seek(0, io.SeekStart)) before io.Copy(gw, t.File) (line 9)

Upvotes: 0

Ode

Reputation: 547

I discovered that even though you may have a struct designed as such (as I do):

type Table struct {
  Name     string
  Path     string
  FileName string
  File     *os.File
  Buffer   *bufio.Writer
  Data     chan string
}

passing a pointer to that struct into a function does not guarantee that Table.File is still in an open, readable state (or that its cursor is where you expect).

I made sure the file was closed once writing to it was complete, and reopened it inside my upload function. This resolved the issue, and the full gzipped file was uploaded to S3.

Thanks for the heads up on the possible issue @jrwren

Upvotes: 1
