Reputation: 11
I use this code:
import os
import hashlib

with open('plain_text.txt') as f:  # Plain text input
    for line in f.readlines():
        line = line.rstrip("\n")
        m = hashlib.sha256(line.encode('utf-8'))
        os.system(f'echo {m.hexdigest()} >> sha256_file.txt')  # Append hex digest to the output file
I want to speed up hashing each line of the text file, or find another method that produces the same result faster.
Upvotes: -2
Views: 94
Reputation: 55888
Right now, you are spawning a new shell process for each individual hash just to append it to sha256_file.txt. Depending on the length of each line, this process creation will likely take much longer than the hash calculation itself.
You should instead write to the file directly from Python, e.g. as follows:
import hashlib

with open('plain_text.txt', 'r') as f:
    with open('sha256_file.txt', 'a') as sha256_file:
        for line in f:
            line = line.rstrip("\n")
            hexdigest = hashlib.sha256(line.encode('utf-8')).hexdigest()
            print(hexdigest, file=sha256_file)
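If you want to squeeze out a little more, a further (usually minor) gain can come from buffering the digests and writing them in a single call instead of one print per line. Here is a minimal sketch of that idea, assuming plain_text.txt fits comfortably in memory:

import hashlib

# Minimal sketch: collect all hex digests in memory, then write them out
# with a single call. Assumes the whole input file fits in memory.
with open('plain_text.txt', 'r') as f:
    digests = [
        hashlib.sha256(line.rstrip('\n').encode('utf-8')).hexdigest()
        for line in f
    ]

with open('sha256_file.txt', 'a') as sha256_file:
    if digests:
        sha256_file.write('\n'.join(digests) + '\n')

In practice, hashlib's SHA-256 runs in C and is already fast, so once the per-line shell processes are gone, file I/O is usually the only remaining cost worth tuning.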
Upvotes: 0