H20

Reputation: 115

Trouble with duplicate lines of logs in log file

I've been trying to get this working correctly all day, and it's nearly complete, but I'm getting one strange issue. Every result found by the search query gets logged as expected, but the first result gets logged once, the second gets logged twice, the third three times, and so on.

Any ideas how to get rid of the duplicates? Here is the script:

#!/usr/bin/python
import urllib
import simplejson 
import logging
from logging.handlers import SysLogHandler

query = urllib.urlencode({'q' : 'test'})
url = 'http://ajax.googleapis.com/ajax/services/search/web?v=1.0&%s' \
      % (query)
search_results = urllib.urlopen(url)
json = simplejson.loads(search_results.read())
results = json['responseData']['results']
for i in results:
    logger = logging.getLogger()
    logger.addHandler(SysLogHandler(address=('192.168.0.2', 514)))
    logger.addHandler(logging.FileHandler("hits.log"))
    logging.warn(i['url'])
    print i['url']

Upvotes: 5

Views: 2807

Answers (3)

Marko Trajkov

Reputation: 101

I had a similar problem, but I needed to add a new handler on each pass through the for loop, so just adding the handler outside the loop didn't help me.

When you create a handler like this:

hdl = logging.FileHandler("hits.log")

you need to remove it again at the end of each iteration, like this:

logger.removeHandler(hdl)
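
A minimal sketch of that pattern, assuming you genuinely need a fresh handler on every iteration (the urls list here is just a stand-in for your own data):

import logging

logger = logging.getLogger()

for url in ['http://example.com/a', 'http://example.com/b']:
    hdl = logging.FileHandler("hits.log")  # keep a reference to the handler
    logger.addHandler(hdl)
    logger.warning(url)
    logger.removeHandler(hdl)              # remove the same object again
    hdl.close()                            # release the file handle

Keeping the handler in a variable is the key point: removeHandler needs the exact object that was added, so an anonymous logging.FileHandler(...) passed straight to addHandler can't be removed later.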

Upvotes: 5

shadyabhi

Reputation: 17234

Since you haven't accepted an answer yet, here is what Daniel said spelled out. You need to have

logger = logging.getLogger('')
logger.addHandler(logging.FileHandler("hits.log"))
logger.addHandler(SysLogHandler(address=('192.168.0.2', 514)))

outside the for loop.

Upvotes: 4

Daniel Roseman

Reputation: 599600

Because you're adding a new handler on every pass through the for loop: after n iterations the root logger has n copies of each handler, so the nth result gets written n times. Do the handler setup once, outside the loop, and only do the actual logging.warn inside the loop.
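
Applied to the script in the question, that looks roughly like this (a sketch that keeps the original Python 2 code unchanged apart from moving the handler setup out of the loop):

#!/usr/bin/python
import urllib
import simplejson
import logging
from logging.handlers import SysLogHandler

# Configure the root logger once, before the loop.
logger = logging.getLogger()
logger.addHandler(SysLogHandler(address=('192.168.0.2', 514)))
logger.addHandler(logging.FileHandler("hits.log"))

query = urllib.urlencode({'q': 'test'})
url = 'http://ajax.googleapis.com/ajax/services/search/web?v=1.0&%s' % query
search_results = urllib.urlopen(url)
json = simplejson.loads(search_results.read())
results = json['responseData']['results']

# Only the logging call stays inside the loop, so each URL is logged once.
for i in results:
    logging.warn(i['url'])
    print i['url']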

Upvotes: 5
