Reputation: 13529
I have a program running on an old laptop which is constantly monitoring a Dropbox folder for new files being added. When it's running, the Python process uses close to 50% of the CPU on a dual-core machine, and about 12% on an 8-core machine (which suggests it's using close to 100% of one core). This is giving off a lot of heat.
The relevant bit of code is:
while True:
    files = dict([(f, None) for f in os.listdir(path_to_watch)])
    if len(files) > 0:
        print "You have %s new file/s!" % len(files)
        time.sleep(20)
In the case that there are no new files, surely most of the time should be spent waiting in time.sleep(), which I wouldn't have thought would be CPU-intensive - and the answers here seem to say it shouldn't be.
So, two questions:
1) Since time.sleep() shouldn't be so CPU-intensive, what is going on here?
2) Is there another way of monitoring a folder for changes which would run cooler?
Upvotes: 1
Views: 404
Reputation: 705
There is also a cross-platform API for monitoring file system changes: Watchdog.
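For example, here is a minimal sketch of watching a folder with Watchdog (untested; NewFileHandler is just an illustrative name, and path_to_watch is the directory from the question):

import time
from watchdog.observers import Observer
from watchdog.events import FileSystemEventHandler

class NewFileHandler(FileSystemEventHandler):
    # the observer thread calls this whenever something is created
    def on_created(self, event):
        if not event.is_directory:
            print "You have a new file: %s" % event.src_path

observer = Observer()
observer.schedule(NewFileHandler(), path_to_watch, recursive=False)
observer.start()  # the observer runs in its own thread
try:
    while True:
        time.sleep(1)  # main thread idles; events arrive via the callback
except KeyboardInterrupt:
    observer.stop()
observer.join()

Because the observer waits on OS-level change notifications instead of polling, the process should stay idle (and cool) between events.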
Upvotes: 2
Reputation: 6448
1) Your sleep only gets called when there are new files. When the folder is empty, the if test fails, the sleep is skipped, and the loop spins through os.listdir() as fast as it can - hence one core pegged at 100%.
This should be much better:
while True:
    files = dict([(f, None) for f in os.listdir(path_to_watch)])
    if len(files) > 0:
        print "You have %s new file/s!" % len(files)
    time.sleep(20)  # sleep on every iteration, not just when files are found
2) Yes, especially if you are using Linux. Gamin would be something I'd recommend looking into.
Example:
import gamin
import time

mydir = "/path/to/watch"

def callback(path, event):
    global mydir
    try:
        if event == gamin.GAMCreated:
            print "New file detected: %s" % (path)
            fullname = mydir + "/" + path
            print "Going to read", fullname
            data = open(fullname).read()
            print "Going to upload", fullname
            rez = upload_file(data, path)  # your own upload function
            print "Response from uploading was", rez
    except Exception, e:  # Not good practice
        print e
        import pdb
        pdb.set_trace()

mon = gamin.WatchMonitor()
mon.watch_directory(mydir, callback)
time.sleep(1)
while True:
    ret = mon.handle_one_event()
mon.stop_watch(mydir)
del mon
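As far as I recall, handle_one_event() blocks until Gamin (which sits on top of inotify on Linux) reports a change, so unlike the polling loop the process sleeps until something actually happens and CPU usage stays near zero. If you'd rather poll without blocking, the binding also has event_pending().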
Upvotes: 3