Andrea Fieschi

Reputation: 13

IPython: save the output of a code cell while still showing the output normally during execution

The usual answer to this question is to use the cell magic %%capture cap. The problem is that it suppresses the normal output, and to see it you have to run cap.show() after the cell has finished. When running a cell that takes a long time, like training a neural network, this becomes a pain in the neck. How can I run my code cell with the real-time output as usual, and then be able to save it to a .txt file afterwards?
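For reference, the workflow I'm describing looks roughly like this (the training loop and the file name are just placeholders):

%%capture cap
# everything printed in this cell is suppressed while it runs
for epoch in range(3):                  # stand-in for a long training loop
    print(f"epoch {epoch}: loss=...")   # nothing appears in real time

# in a later cell, the output only shows up after the fact
cap.show()                              # replay the captured output
with open('out.txt', 'w') as f:
    f.write(cap.stdout)                 # CapturedIO keeps the captured text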

Upvotes: 1

Views: 263

Answers (1)

paxton4416

Reputation: 565

This isn't IPython/Jupyter-specific, but here's a context manager I wrote for a similar purpose:

import sys


class capture_stdout:
    """
    Context manager similar to `contextlib.redirect_stdout`, but
    different in that it:
      - temporarily writes stdout to other streams *in addition to*
        rather than *instead of* `sys.stdout`
      - accepts any number of streams and sends stdout to each
      - can optionally keep streams open after exiting context by
        passing `closing=False`

    Parameters
    ----------
    *streams : `io.IOBase`
        stream(s) to receive data sent to `sys.stdout`

    closing : `bool`, optional
        if `True` (the default), close the streams upon exiting the
        context block.
    """
    def __init__(self, *streams, closing=True):
        self.streams = streams
        self.closing = closing
        self.sys_stdout_write = sys.stdout.write

    def __enter__(self):
        sys.stdout.write = self._write
        if len(self.streams) == 1:
            return self.streams[0]
        return self.streams

    def __exit__(self, exc_type, exc_value, traceback):
        sys.stdout.write = self.sys_stdout_write
        if self.closing:
            for s in self.streams:
                s.close()

    def _write(self, data):
        for s in self.streams:
            s.write(data)
        self.sys_stdout_write(data)
        sys.stdout.flush()

You can pass as many streams as you want, and it'll write to all of them in addition to the regular sys.stdout in real time. So, for example, if you wanted to show the live output, capture it for use later in your code, and also log it to a file, you could do:

from io import StringIO


with capture_stdout(StringIO(), open('logfile.txt', 'w')) as (mem_stream, file_stream):
    print('some really long output')
    # etc...
    stdout = mem_stream.getvalue()
    

print(f"in str: {stdout}")
with open('logfile.txt', 'r') as f:
    print(f"in file: {f.read()}")
which prints:

some really long output

in str: some really long output

in file: some really long output

It also shouldn't be too hard to turn this into a cell magic that works the same way; see the IPython docs section on defining custom magics if you want to give it a shot.
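As a rough sketch of what that could look like (assuming the capture_stdout class above is already defined in the session; the magic name tee and the log-file argument are my own choices, not anything built into IPython):

from IPython import get_ipython
from IPython.core.magic import register_cell_magic


@register_cell_magic
def tee(line, cell):
    """Run the cell, echoing stdout live while also appending a copy to
    the file named on the magic line, e.g. `%%tee run.log`."""
    path = line.strip() or 'cell_output.txt'
    # capture_stdout closes the file on exit since closing defaults to True
    with capture_stdout(open(path, 'a')):
        get_ipython().run_cell(cell)

After running that once, putting %%tee training.log at the top of your long-running cell would stream the output normally and append a copy to training.log.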

Upvotes: 1
