Reputation: 21
I am having a problem with Python 2.7 and boto3 for writing files to an S3 bucket. Specifically, when I write to a file on my EC2 instance, close it, and then try to write the new file to an S3 bucket I see that a file is written, but it is empty (0 bytes). Here's the code snippet:
#!/usr/bin/python
import boto3
newfile = open('localdestination','w')
newfile.write('ABCDEFG')
newfile.close
fnamebuck = 'bucketdestination'
client = boto3.client('s3')
inptstr = 'localdestination'
client.upload_file(inptstr, 'bucketname', fnamebuck)
I have tried modifying permissions, adding a delay after the file is closed, changing my variable names, and a variety of code alterations, but to no avail. I don't receive any error messages. Any ideas what's wrong with this S3 bucket write?
Upvotes: 2
Views: 7775
Reputation: 13176
Don't use a plain open() in Python; it is an anti-pattern and makes this kind of mistake hard to spot. Always use "with open()". When the with block exits, Python closes the file for you (and flushes its buffer), so there are no surprises.
See also: Not using "with" to open files
import boto3
inptstr = 'localdestination'
with open(inptstr,'w') as newfile:
    newfile.write('ABCDEFG')
fnamebuck = 'bucketdestination'
s3 = boto3.client('s3')
s3.upload_file(inptstr, 'bucketname', fnamebuck)
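As a quick sanity check (a minimal sketch, using a hypothetical file name), you can verify that the context manager really does close and flush the file before the upload runs:

```python
# The with-statement closes (and flushes) the file automatically on exit.
with open('demo.txt', 'w') as f:
    f.write('ABCDEFG')

# Outside the block the file object is closed, so all bytes are on disk
# before any upload is attempted.
print(f.closed)  # True
```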
Upvotes: 7
Reputation: 1692
From your code it seems you never actually invoke the close() function: you are missing the parentheses, so newfile.close merely references the method without calling it, and the buffered data is never flushed to disk before the upload.
#!/usr/bin/python
import boto3
newfile = open('localdestination','w')
newfile.write('ABCDEFG')
newfile.close() # <---
fnamebuck = 'bucketdestination'
client = boto3.client('s3')
inptstr = 'localdestination'
client.upload_file(inptstr, 'bucketname', fnamebuck)
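To see why the parentheses matter (a small sketch with a hypothetical file name): writing f.close without calling it is effectively a no-op, so the write stays in Python's buffer and the file on disk is empty until close() actually runs:

```python
f = open('demo.txt', 'w')
f.write('ABCDEFG')

f.close        # just references the bound method; nothing happens
print(f.closed)  # False -- the file is still open, data still buffered

f.close()      # actually closes the file and flushes the write buffer
print(f.closed)  # True
```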
Upvotes: 1