wanonoui

Reputation: 11

How can I read a big text file line by line and work with multiprocessing?

I want to ask if there is a solution for reading a large text file line by line:

keyword = open('./list.txt', 'r', encoding="utf-8").read().splitlines()

with Pool(processes=2) as executor:
    executor.map(verify, keyword)

I just want to know how I can read the file line by line and pass each line to the executor. I couldn't find a solution that works, since I'm new to multiprocessing. I tried this code:

file1 = open('./list.txt', 'r', encoding="utf-8")
Lines = file1.readlines()

def keyword():
    for line in Lines:
        return line

but I got this error:

Traceback (most recent call last):
  File "main.py", line 40, in <module>
    executor.map(verify, keyword)
  File "/nix/store/2vm88xw7513h9pyjyafw32cps51b0ia1-python3-3.8.12/lib/python3.8/multiprocessing/pool.py", line 364, in map
    return self._map_async(func, iterable, mapstar, chunksize).get()
  File "/nix/store/2vm88xw7513h9pyjyafw32cps51b0ia1-python3-3.8.12/lib/python3.8/multiprocessing/pool.py", line 475, in _map_async
    iterable = list(iterable)
TypeError: 'function' object is not iterable

Upvotes: 1

Views: 52

Answers (1)

JP Marcel

Reputation: 56

Your error is here:

def keyword():

So when you call executor.map(verify, keyword), map tries to iterate over keyword, which is a function.

You just need to replace:

def keyword():

with:

def verify(line):

verify needs at least one parameter, since map will iterate over keyword and call verify on each line.
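
Putting the pieces together, a minimal sketch of the corrected script could look like this, assuming verify holds your actual per-line work (its body here is only a placeholder):

from multiprocessing import Pool

def verify(line):
    # placeholder worker: replace with your real per-line logic
    return line.strip()

if __name__ == '__main__':
    # read every keyword line into a list
    with open('./list.txt', 'r', encoding='utf-8') as f:
        keyword = f.read().splitlines()

    # two worker processes, verify is called on each line
    with Pool(processes=2) as executor:
        results = executor.map(verify, keyword)

If list.txt is too large to hold in memory, note that Pool.map builds a list from its iterable first. One option is executor.imap(verify, f, chunksize=1000) over the open file handle, which streams lines to the workers instead (each line then keeps its trailing newline, so strip it in verify).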

Upvotes: 2
