marsolmos

Reputation: 794

How to read pickle file from AWS S3 nested directory?

I have a pickle file in a nested directory inside an AWS S3 bucket, but I'm not able to load it with the boto3 library to use it in an AWS Lambda function.

I've tried to follow the answers to this question, but none of them works. This is my code so far:

import pickle
import boto3

s3 = boto3.resource('s3')
source_bucket = "source_bucket_name"
key = "folder1/pickle_file.p"

# fetch the object and read its body as bytes
response = s3.Bucket(source_bucket).Object(key).get()
body_string = response['Body'].read()

try:
    loaded_pickle = pickle.loads(body_string)
except Exception as e:
    print(e)

EDIT

When running this function in AWS Lambda, I get the following error message:

embedded null byte

Upvotes: 1

Views: 3029

Answers (1)

Marcin

Reputation: 238131

Your code seems fine apart from what I said in the comments, so maybe your upload to S3 is incorrect. Below is a full working example:

import pickle
import boto3

mylist = [1, 2, 3]

# create pickle file
with open('/tmp/pickle_file.p', 'wb') as f:
    pickle.dump(mylist, f)

# upload to s3
source_bucket = 'source_bucket_name'
key = "folder1/pickle_file.p"

with open('/tmp/pickle_file.p', 'rb') as f:
    response = boto3.client('s3').put_object(
        Body=f,
        Bucket=source_bucket,
        Key=key)
    print(response)

# read back from s3
s3 = boto3.resource('s3')
response = s3.Bucket(source_bucket).Object(key).get()
body_string = response['Body'].read()

try:
    loaded_pickle = pickle.loads(body_string)
except Exception as e:
    print(e)

# should print out `mylist`
print(loaded_pickle)
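
If the upload is the problem (for example, the file was written or transferred in text mode, which can corrupt the binary stream and produce errors like "embedded null byte"), a quick check is to inspect the bytes actually stored in S3. Below is a minimal sketch, assuming the bucket and key names from the question; a pickle written with protocol 2 or later typically starts with the byte 0x80, and pickletools will reject anything that isn't a valid pickle stream.

import pickletools
import boto3

# minimal sketch (bucket/key names assumed from the question) to verify that
# the object stored in S3 actually contains pickle data
s3 = boto3.resource('s3')
body = s3.Bucket("source_bucket_name").Object("folder1/pickle_file.p").get()['Body'].read()

# a pickle written with protocol 2+ usually starts with b'\x80'; anything else
# suggests the bytes were altered on the way into S3
print(body[:10])

# pickletools.dis raises an exception if the stream is not a valid pickle
pickletools.dis(body)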

Upvotes: 1
