Reputation: 39
Hey, I'm writing a program that's supposed to search through .log files and find the keyword "Complete Response". All found lines should then be written and saved to a new .txt file. I've managed to get the program to search through one document at a time, but I have 50+ documents of the same type in one directory, and I want to search through all of them at once and put every match into the same .txt file. I could really use some help...! Thanks
def read_log_file(filename, keyword):
    saved_word = []  # list of matching lines
    # read the file
    with open(filename) as file_search:  # open the file to search
        file_search = file_search.readlines()  # read all lines
        for lines in file_search:  # every line is scanned
            if keyword in lines:  # keep lines containing the keyword
                saved_word.append(lines)  # store all found lines in the list
    # write the matches to a new file
    with open('CompleteResponse.txt', 'w') as file_handler:
        file_handler.write(f"{filename}\n")
        for i in range(len(saved_word)):
            file_handler.write(f"{saved_word[i]}")
    print('done')  # completed
    print(len(saved_word))  # count of found lines
read_log_file(r'C:\Users\\Documents\read_log_files\test.log', 'Complete Response:')
Upvotes: 0
Views: 1692
Reputation: 114786
Open your output file 'CompleteResponse.txt' in append mode ('a') instead of write mode:
with open('CompleteResponse.txt', 'a') as file_handler:
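This matters because 'w' truncates the file every time it is opened, so each call to read_log_file would wipe out the previous file's results, while 'a' keeps the existing content and writes after it. A quick illustration (demo.txt is just a throwaway name for this sketch):

with open('demo.txt', 'w') as f:   # 'w' truncates the file on open
    f.write('first\n')
with open('demo.txt', 'a') as f:   # 'a' appends after the existing content
    f.write('second\n')
# demo.txt now holds both lines; opening with 'w' twice would leave only 'second'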
Then call your function in a loop:
import glob
for filename in glob.glob(r'C:\Users\\Documents\read_log_files\*.log'):
    read_log_file(filename, 'Complete Response:')
Should do the trick for you.
You can find a detailed list of file opening modes in the documentation of the built-in open() function.
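As an aside, if you'd rather sidestep the backslash escaping in Windows paths entirely, pathlib can do the same job. This is just an alternative sketch, assuming the same directory layout as in the question:

from pathlib import Path

# Path.glob takes a plain pattern, so the pattern itself needs no escaping
log_dir = Path(r'C:\Users\\Documents\read_log_files')
for log_file in log_dir.glob('*.log'):
    read_log_file(str(log_file), 'Complete Response:')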
PS,
If you intend to call this function several times, the output file 'CompleteResponse.txt' will contain ALL results (the concatenation of the outputs from ALL runs). To avoid this, you might want to "reset" the file once before processing all the log files:
with open('CompleteResponse.txt', 'w') as file_handler:
    pass  # opening with 'w' truncates ("resets") the file
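Putting the pieces together, a minimal end-to-end version might look like this (a sketch that keeps the question's path and keyword):

import glob

def read_log_file(filename, keyword):
    saved_word = []
    with open(filename) as file_search:
        for line in file_search:  # scan every line
            if keyword in line:  # keep lines containing the keyword
                saved_word.append(line)
    # 'a' so results from earlier files are not overwritten
    with open('CompleteResponse.txt', 'a') as file_handler:
        file_handler.write(f"{filename}\n")
        for line in saved_word:
            file_handler.write(line)
    print(len(saved_word))  # matches found in this file

# "reset" the output file once, before the loop
with open('CompleteResponse.txt', 'w'):
    pass

for filename in glob.glob(r'C:\Users\\Documents\read_log_files\*.log'):
    read_log_file(filename, 'Complete Response:')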
Upvotes: 3