user1998665

Reputation: 81

I/O optimization for Python

I'm writing a program that takes in multiple lines of code. At the moment, I process each line separately: as each line comes in, I strip it, store it, etc., before asking for the next line via the sys.stdin.readline() method. I was wondering if there is a way to improve my efficiency, as my program is currently too slow. Is it faster to take all the lines at once (I know how many lines I am expecting), store them in a list, and then process them? If so, is there a built-in function designed for speed that can do this efficiently?
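
For reference, a minimal sketch of the current line-by-line approach (variable names and the line-count handling are illustrative):

import sys
n = int(sys.stdin.readline())                 # number of lines expected (illustrative)
lines = []
for _ in range(n):
    line = sys.stdin.readline().rstrip('\n')  # strip the trailing newline
    lines.append(line)
    # ... process line here ...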

Thanks

Upvotes: 0

Views: 1510

Answers (2)

martineau

Reputation: 123393

You could speed things up a bit by reading all the lines in at once and then stripping and putting them into a list like this:

import sys
lines = sys.stdin.read().splitlines()  # splitlines() drops the newline characters
if lines:
    pass  # process lines list....
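
(Note: splitlines() handles both '\n' and '\r\n' line endings and drops them, so no separate strip step is needed afterwards.)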

Upvotes: 1

Tauchiss

Reputation: 1

The readlines() method reads the file and automatically splits it into a list of lines. Each element keeps its trailing newline, so you can then apply the rstrip('\r\n') method to each element. Alternatively, if you know the newline character (normally '\n' on *nix), you can chain the read and split methods like this: .read().split('\n').

Sample code:

file_obj = open('path/to/file', 'r')   # placeholder path
# Method 1
file_list = file_obj.readlines()
for f in file_list:
    clean = f.rstrip('\r\n')           # remove the trailing newline characters
    # do something with clean
# Method 2
file_obj.seek(0)                       # rewind, since Method 1 already read the whole file
clean_list = file_obj.read().split('\n')
# do something with clean_list
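
Since the question reads from standard input rather than a file, the same pattern works with sys.stdin; a minimal sketch, assuming the whole input can be read at once:

import sys
clean_list = sys.stdin.read().split('\n')  # note: a trailing newline leaves an empty last element
# do something with clean_list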

Upvotes: 0
