Reputation: 9759
I want to change a couple of files at one time, iff I can write to all of them. I'm wondering if I somehow can combine the multiple open calls with the with statement:
try:
    with open('a', 'w') as a and open('b', 'w') as b:
        do_something()
except IOError as e:
    print 'Operation failed: %s' % e.strerror
If that's not possible, what would an elegant solution to this problem look like?
Upvotes: 974
Views: 529181
Reputation: 17
If you want to preserve the original line breaks: a file that has been edited and saved across multiple different operating systems may contain a mix of line endings -- \n (LF, line feed), used on Linux and other Unix systems such as OpenBSD; \r\n (CR carriage return followed by LF line feed), used on Microsoft Windows; and \r, used by classic Mac OS (pre-OS X).
with open('example.txt', 'rb') as f:
    content = f.read()

content_str = content.decode('utf-8')
content_str will have the original line breaks as they appear in the file.
print(repr(content_str))  # Use repr to show line breaks in the output
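A related sketch, assuming Python 3: opening in text mode with newline='' also leaves the line endings untranslated, so the manual decode step is not needed:
with open('example.txt', 'r', encoding='utf-8', newline='') as f:
    content_str = f.read()  # line breaks are returned exactly as stored

print(repr(content_str))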
Upvotes: 1
Reputation: 602425
As of Python 2.7 (or 3.1 respectively) you can write
with open('a', 'w') as a, open('b', 'w') as b:
    do_something()
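This also satisfies the question's "iff I can write to all of them" requirement: if the second open() fails, the first file is closed again before the exception propagates. A sketch of the error handling from the question, assuming Python 3 (where IOError is an alias of OSError):
try:
    with open('a', 'w') as a, open('b', 'w') as b:
        do_something()  # placeholder from the question
except IOError as e:
    print('Operation failed: %s' % e.strerror)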
(Historical note: In earlier versions of Python, you could sometimes use contextlib.nested() to nest context managers. This won't work as expected for opening multiple files, though -- see the linked documentation for details.)
In the rare case that you want to open a variable number of files all at the same time, you can use contextlib.ExitStack, starting from Python version 3.3:
from contextlib import ExitStack

with ExitStack() as stack:
    files = [stack.enter_context(open(fname)) for fname in filenames]
    # Do something with "files"
Note that more commonly you want to process files sequentially rather than opening all of them at the same time, in particular if you have a variable number of files:
for fname in filenames:
    with open(fname) as f:
        # Process f
Upvotes: 1487
Reputation: 41208
From Python 3.10 there is a new feature, Parenthesized context managers, which permits syntax like:
with (
    open("a", "w") as a,
    open("b", "w") as b
):
    do_something()
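The parenthesized form scales to more than two files and allows a trailing comma; a small sketch with hypothetical filenames, copying one input into two outputs:
with (
    open("in.txt") as src,
    open("out1.txt", "w") as dst1,
    open("out2.txt", "w") as dst2,
):
    data = src.read()
    dst1.write(data)
    dst2.write(data)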
Upvotes: 73
Reputation: 99001
Late answer (8 yrs), but for someone looking to concatenate the contents of multiple files into one string, the following function may be of help:
def multi_open(_list):
    out = ""
    for x in _list:
        try:
            with open(x) as f:
                out += f.read()
        except OSError:
            pass  # print(f"Cannot open file {x}")
    return out

fl = ["C:/bdlog.txt", "C:/Jts/tws.vmoptions", "C:/not.exist"]
print(multi_open(fl))
2018-10-23 19:18:11.361 PROFILE [Stop Drivers] [1ms]
2018-10-23 19:18:11.361 PROFILE [Parental uninit] [0ms]
...
# This file contains VM parameters for Trader Workstation.
# Each parameter should be defined in a separate line and the
...
Upvotes: 4
Reputation: 78780
Since Python 3.3, you can use the class ExitStack from the contextlib module to safely open an arbitrary number of files.
It can manage a dynamic number of context-aware objects, which means that it will prove especially useful if you don't know how many files you are going to handle.
In fact, the canonical use-case that is mentioned in the documentation is managing a dynamic number of files.
with ExitStack() as stack:
    files = [stack.enter_context(open(fname)) for fname in filenames]
    # All opened files will automatically be closed at the end of
    # the with statement, even if attempts to open files later
    # in the list raise an exception
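If the files need to stay open after the with block ends, the pattern from the contextlib documentation hands cleanup off with pop_all() -- a sketch, assuming filenames is defined:
from contextlib import ExitStack

with ExitStack() as stack:
    files = [stack.enter_context(open(fname)) for fname in filenames]
    # Hold onto the close method, but don't call it yet
    close_files = stack.pop_all().close
# If any open() failed, everything opened before it was already closed.
# Otherwise the files stay open here; call close_files() to close them all.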
If you are interested in the details, here is a generic example in order to explain how ExitStack operates:
from contextlib import ExitStack

class X:
    num = 1

    def __init__(self):
        self.num = X.num
        X.num += 1

    def __repr__(self):
        cls = type(self)
        return '{cls.__name__}{self.num}'.format(cls=cls, self=self)

    def __enter__(self):
        print('enter {!r}'.format(self))
        return self.num

    def __exit__(self, exc_type, exc_value, traceback):
        print('exit {!r}'.format(self))
        return True

xs = [X() for _ in range(3)]

with ExitStack() as stack:
    print(len(stack._exit_callbacks))  # number of callbacks called on exit
    nums = [stack.enter_context(x) for x in xs]
    print(len(stack._exit_callbacks))

print(len(stack._exit_callbacks))
print(nums)
Output:
0
enter X1
enter X2
enter X3
3
exit X3
exit X2
exit X1
0
[1, 2, 3]
Upvotes: 37
Reputation: 628
In Python 2.6 the comma-separated form will not work; we have to nest the with statements to open multiple files:
with open('a', 'w') as a:
    with open('b', 'w') as b:
        do_something()
Upvotes: 6
Reputation: 5109
Nested with statements will do the same job, and in my opinion, are more straightforward to deal with.
Let's say you have inFile.txt and want to write it into two outFiles simultaneously.
with open("inFile.txt", 'r') as fr:
    with open("outFile1.txt", 'w') as fw1:
        with open("outFile2.txt", 'w') as fw2:
            for line in fr.readlines():
                fw1.writelines(line)
                fw2.writelines(line)
EDIT:
I don't understand the reason for the downvote. I tested my code before publishing my answer, and it works as desired: it writes to both outFiles, just as the question asks. There is no duplicate writing or failure to write. So I am really curious to know why my answer is considered wrong, suboptimal or anything like that.
Upvotes: 31
Reputation: 10990
For opening many files at once or for long file paths, it may be useful to break things up over multiple lines. From the Python Style Guide as suggested by @Sven Marnach in comments to another answer:
with open('/path/to/InFile.ext', 'r') as file_1, \
     open('/path/to/OutFile.ext', 'w') as file_2:
    file_2.write(file_1.read())
Upvotes: 103
Reputation: 9068
Just replace the and with a comma, and you're done:
try:
    with open('a', 'w') as a, open('b', 'w') as b:
        do_something()
except IOError as e:
    print 'Operation failed: %s' % e.strerror
Upvotes: 134