the_dummy

Reputation: 347

How to save a sklearn model to S3 using joblib.dump?

I have a sklearn model and I want to save the pickle file to my S3 bucket using joblib.dump.

I used joblib.dump(model, 'model.pkl') to save the model locally, but I do not know how to save it to an S3 bucket. This is what I tried:

s3_resource = boto3.resource('s3')
s3_resource.Bucket('my-bucket').Object("model.pkl").put(Body=joblib.dump(model, 'model.pkl'))

I expect the pickled file to end up in my S3 bucket.

Upvotes: 19

Views: 16341

Answers (4)

nbeuchat

Reputation: 7091

You can also use the s3fs library.

import joblib
import s3fs
import os

fs = s3fs.S3FileSystem()
output_file = os.path.join("s3://...", "model.joblib")

# Write the fitted model (clf) to S3
with fs.open(output_file, 'wb') as f:
    joblib.dump(clf, f)

# Read it back
with fs.open(output_file, 'rb') as f:
    clf = joblib.load(f)
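
Note that s3fs is a separate package (pip install s3fs) and, as far as I know, it resolves AWS credentials the same way boto3 does (environment variables, ~/.aws/credentials, or an IAM role), so no extra configuration should usually be needed.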

Upvotes: 7

Alexei Andreev

Reputation: 675

Here's a way that worked for me. It's pretty straightforward and easy. I'm using joblib (it's better for storing large sklearn models), but you could use pickle too.
I'm also using a temporary file for the transfer to/from S3, but you could store the file in a more permanent location if you wanted.

import tempfile
import boto3
import joblib

s3_client = boto3.client('s3')
bucket_name = "my-bucket"
key = "model.pkl"

# WRITE
with tempfile.TemporaryFile() as fp:
    joblib.dump(model, fp)
    fp.seek(0)
    s3_client.put_object(Body=fp.read(), Bucket=bucket_name, Key=key)

# READ
with tempfile.TemporaryFile() as fp:
    s3_client.download_fileobj(Fileobj=fp, Bucket=bucket_name, Key=key)
    fp.seek(0)
    model = joblib.load(fp)

# DELETE
s3_client.delete_object(Bucket=bucket_name, Key=key)
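
If you would rather skip the temporary file entirely, joblib can also write to an in-memory buffer. A minimal sketch of the same pattern with io.BytesIO (model, bucket_name and key are the same placeholders as above):

import io

# WRITE without touching the disk
buffer = io.BytesIO()
joblib.dump(model, buffer)
buffer.seek(0)
s3_client.upload_fileobj(buffer, bucket_name, key)

# READ it back
buffer = io.BytesIO()
s3_client.download_fileobj(bucket_name, key, buffer)
buffer.seek(0)
model = joblib.load(buffer)

For very large models the temporary file may still be preferable, since the whole pickle has to fit in memory with this approach.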

Upvotes: 18

Abhi

Reputation: 33

Just correcting Sayali Sonawane's answer: the Key passed to put_object has to be the path inside the bucket, not a full s3:// URL.

import tempfile
import boto3
import joblib

s3 = boto3.resource('s3')

# you can dump it in .sav or .pkl format
location = 'folder_name/'  # THIS is the change that makes the code work: no s3:// prefix or bucket name in the key
model_filename = 'model.sav'  # use any extension you want (.pkl or .sav)
OutputFile = location + model_filename

# WRITE
with tempfile.TemporaryFile() as fp:
    joblib.dump(scikit_learn_model, fp)
    fp.seek(0)
    # use bucket_name and OutputFile - the S3 key path as a string
    s3.Bucket('bucket_name').put_object(Key=OutputFile, Body=fp.read())
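
To load the model back later, the steps can be reversed; a sketch reusing the same bucket_name and OutputFile placeholders:

# READ
with tempfile.TemporaryFile() as fp:
    s3.Bucket('bucket_name').download_fileobj(OutputFile, fp)
    fp.seek(0)
    scikit_learn_model = joblib.load(fp)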

Upvotes: 1

Sayali Sonawane

Reputation: 12599

Use the following code to dump your model to an S3 location in .pkl or .sav format:

import tempfile
import boto3
import joblib

s3 = boto3.resource('s3')

# you can dump it in .sav or .pkl format
location = 's3://bucket_name/folder_name/'
model_filename = 'model.sav'  # use any extension you want (.pkl or .sav)
OutputFile = location + model_filename

# WRITE
with tempfile.TemporaryFile() as fp:
    joblib.dump(scikit_learn_model, fp)
    fp.seek(0)
    # use bucket_name and OutputFile - s3 location path in string format.
    s3.Bucket('bucket_name').put_object(Key=OutputFile, Body=fp.read())

Upvotes: 4
