Reputation: 48
I have written my first Python script (yay) for performing a query against a Postgres database to: count the number of rows in a table, delete rows from that table based on date criteria, then count rows again to show that the deletion was indeed successful. For each of those portions, a file is opened, updated, and closed to record each task as it's completed.
A comment by a peer, however, has made me wonder if there's a better way to handle writes to the file instead of opening and closing it for each task. He mentioned Python's logging
library, but in my inexperience I feel like I may be missing its utility as it relates to improving the file writes.
Here's a snippet of the script showing what I'm doing presently:
cur = con.cursor()
cur.execute("SELECT COUNT(*) OVER() FROM test LIMIT 1")
rows = cur.fetchall()
for row in rows:
    logFile = open('/var/log/test_script.log', 'a')
    print("Pre-Delete Row Count =", row[0], file=logFile)
    logFile.close()
Thoughts about how writing to file could be improved based on the snippet?
Upvotes: 1
Views: 61
Reputation: 309
Firstly, you can use the context-manager syntax (with open(...)) for the file; secondly, you can move the open outside of the for loop. See this example:
cur = con.cursor()
cur.execute("SELECT COUNT(*) OVER() FROM test LIMIT 1")
rows = cur.fetchall()
with open('/var/log/test_script.log', 'a') as f:
    for row in rows:
        print("Pre-Delete Row Count =", row[0], file=f)
Upvotes: 1
Reputation: 27283
You can just move the opening of the file above the for loop:
cur = con.cursor()
cur.execute("SELECT COUNT(*) OVER() FROM test LIMIT 1")
rows = cur.fetchall()
with open('/var/log/test_script.log', 'a') as logFile:
    for row in rows:
        print("Pre-Delete Row Count =", row[0], file=logFile)
Note that you no longer need to close the file explicitly when you open it with a context manager; the with block closes it for you, even if an exception is raised.
Upvotes: 1