AJR

Reputation: 589

Uploading Log file to S3 bucket?

I have a program in Python and it's working great. However, I decided to add some logging to track the progress and write the steps to a log file. I'm having issues since it's my first time using Python's logging library. The goal is to log the steps to a file and then upload that file to S3. What am I missing? Please review the code below.

start_time = time.time()
logging.basicConfig(filename='myLogFile.log', format='%(asctime)s %(levelname)s %(name)s: %(message)s', datefmt='%d-%b-%y %H:%M:%S', level=logging.INFO)
logger = logging.getLogger("GlueJob")
logging.info("Program started ....")
logger.setLevel(logging.INFO)
log_stringio = io.StringIO()
handler = logging.StreamHandler(log_stringio)
logger.addHandler(handler)

# Do something ...
logging.info("List has all objects from S3 ... good")

# Do something ...
logging.info("All created lists are populated with elements from S3 ... good")

# Do something ...
logging.info("Dictionary and Dataframe has been created ... good")

# Do something ...
logging.info("Convert dataframe to csv ... good")

# here is the problem ....... Logfile is not uploading to S3 ### What am I missing??
s3.Bucket('my-bucket').upload_file(Filename='myLogFile.log', Key='/Asset_Filename_Database/folder1/folder2/myLogFile.log')

 print("Process Finsihed --- %s seconds ---" %(time.time() - start_time))

Thank you !!!

Upvotes: 1

Views: 3627

Answers (1)

KayD

Reputation: 826

A leading / in the key name creates a no-name folder, so Key='/Asset_Filename_Database/folder1/folder2/myLogFile.log' puts the object under an empty top-level folder; use Key='Asset_Filename_Database/folder1/folder2/myLogFile.log' (no leading slash) instead.
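
For the call in the question, that would look like this (a minimal sketch, reusing the asker's bucket name and path):

import boto3

# Same upload as in the question, just without the leading slash in the key
s3 = boto3.resource('s3')
s3.Bucket('my-bucket').upload_file(
    Filename='myLogFile.log',
    Key='Asset_Filename_Database/folder1/folder2/myLogFile.log'
)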


I ran this example with all three approaches (Object, Client, and Bucket), and it works for me.

import logging
import io
import time
import boto3

start_time = time.time()
# basicConfig attaches a file handler to the root logger, so logging.info(...) writes to myLogFile.log
logging.basicConfig(filename='myLogFile.log', format='%(asctime)s %(levelname)s %(name)s: %(message)s', datefmt='%d-%b-%y %H:%M:%S', level=logging.INFO)
logger = logging.getLogger("GlueJob")
logging.info("Program started ....")
logger.setLevel(logging.INFO)
# The "GlueJob" logger additionally mirrors its own records into an in-memory buffer
log_stringio = io.StringIO()
handler = logging.StreamHandler(log_stringio)
logger.addHandler(handler)

logging.info("List has all objects from S3 ... good")
logging.info("All created lists are populated with elements from S3 ... good")
logging.info("Dictionary and Dataframe has been created ... good")
logging.info("Convert dataframe to csv ... good")

# The same file can be uploaded through any of the three boto3 interfaces
s3 = boto3.resource('s3')
s3_client = boto3.client('s3')

# 1. low-level client
s3_client.upload_file('myLogFile.log', 'test-kayd-bucket', 'client/myLogFile.log')

# 2. Object resource
s3.Object('test-kayd-bucket', 'object/myLogFile.log').upload_file('myLogFile.log')
# 3. Bucket resource
s3.Bucket('test-kayd-bucket').upload_file(Filename='myLogFile.log', Key='bucket/myLogFile.log1')

print("Process Finsihed --- %s seconds ---" %(time.time() - start_time))

Upvotes: 2
