Jimmy C

Reputation: 9660

Treat multiple text files as one big file

I have a Python program that takes a text file (or rather, a File object) as input. I have multiple large text files that I would like to treat as one concatenated file and use as that input. Unfortunately, because of space limitations, it would be difficult for me to create the concatenated file explicitly, so I'm looking for an elegant way to 'trick' the program into thinking it is just one file after all. That is, when it is done iterating over all the rows in one file, I would like it to start on the next one, and so on. Suggestions?

Upvotes: 2

Views: 126

Answers (2)

m.wasowski

Reputation: 6387

from itertools import chain

# Iterate over the lines of both files as if they were one file.
with open('foo') as foo, open('bar') as bar:
    for line in chain(foo, bar):
        print(line, end='')
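
If there are more than a couple of files, a small generator keeps the same lazy, line-by-line behaviour without holding every file open at once. This is a minimal sketch assuming the filenames are available in a list (the names below are placeholders):

def chained_lines(filenames):
    # Open each file only when the previous one is exhausted,
    # so at most one handle is open at a time.
    for name in filenames:
        with open(name) as f:
            for line in f:
                yield line

for line in chained_lines(['foo', 'bar', 'baz']):
    print(line, end='')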

Upvotes: 1

RemcoGerlich

Reputation: 31250

The fileinput module does this; see the docs.

If your filenames are all given on the command line, then it's as simple as:

import fileinput

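# With no arguments, fileinput.input() iterates over the files named in
# sys.argv[1:] in order, falling back to standard input if none are given.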
for line in fileinput.input():
    process(line)

Otherwise, just pass the filenames as a list in the first parameter of fileinput.input().
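
For example (the filenames here are placeholders for your own):

import fileinput

for line in fileinput.input(files=['part1.txt', 'part2.txt']):
    process(line)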

Upvotes: 6
