clonelab

Reputation: 651

python 3.4 multiprocessing

This question is asking for advice as well as assistance with some code.

I am currently learning Python 3.4 and have built a basic network checking tool. I import items from a text file, and for each of them I want Python to check DNS (using pydns) and ping the IP (using subprocess to call the OS native ping).

Currently I am checking 5000 to 9000 IP addresses, and it takes a number of hours, approximately 4, to return all the results.

I am wondering if I can use multiprocessing or threading to speed this up, but still return the output to a list so that the rows can be written to a CSV file in bulk at the very end of the script.

I am new to Python, so please tell me if I have overlooked anything else I should have considered.

Main code http://pastebin.com/ZS23XrdE

Class http://pastebin.com/kh65hYhG

Upvotes: 4

Views: 551

Answers (2)

noxdafox

Reputation: 15020

As most of the work seems IO-bound, you can easily rely on threads.

Take a look at the Executor.map() function in concurrent.futures: https://docs.python.org/3/library/concurrent.futures.html

You pass it the function you want to run and the list of IPs to run it against, and it returns the results of that function for each element, in the same order as the input.

In your specific case you can wrap your two worker methods (check_dns_ip and os_ping) in a single function and pass that to ThreadPoolExecutor.map.
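A minimal sketch of that idea, assuming the question's check_dns_ip and os_ping methods (here replaced with stand-ins using the standard socket module and Linux-style ping flags, since the originals are only in the pastebin links) and placeholder file names hosts.txt and results.csv:

    import csv
    import socket
    import subprocess
    from concurrent.futures import ThreadPoolExecutor

    def check_dns_ip(host):
        # Stand-in for the pydns lookup: resolve the host name to an IP.
        try:
            return socket.gethostbyname(host)
        except socket.gaierror:
            return None

    def os_ping(ip):
        # Stand-in for the OS ping call; flags are Linux-style (-c count, -W timeout).
        result = subprocess.call(
            ["ping", "-c", "1", "-W", "1", ip],
            stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
        return result == 0

    def check_host(host):
        # Wrap both checks in one callable so map() can run them per host.
        ip = check_dns_ip(host)
        alive = os_ping(ip) if ip else False
        return (host, ip, alive)

    with open("hosts.txt") as f:
        hosts = [line.strip() for line in f if line.strip()]

    # map() yields results in input order, so the rows line up with the file.
    with ThreadPoolExecutor(max_workers=50) as executor:
        results = list(executor.map(check_host, hosts))

    with open("results.csv", "w", newline="") as out:
        csv.writer(out).writerows(results)

Tune max_workers to taste; since the threads spend almost all their time waiting on DNS and ping, a few dozen is usually safe.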

Upvotes: 0
