Reputation: 720
As a pet project, I want to build something similar to a Jupyter notebook. Given an array of strings, each of which is a piece of Python code, I would like to run the pieces one by one in a single Python process and associate blocks of output with each piece of code. I would also like to manage it all from another (parent) Python process.
To make the problem tangible, let's say I have a list of strings, each a piece of Python code. A piece may use variables defined in the preceding pieces, i.e. they all have to run in a single process. Now I want to run one piece of code, wait until it finishes, capture the output, then run the next piece, and so on.
Unfortunately, googling around only gave me examples where a piece of code is run with subprocess.Popen('python', stdout=PIPE, ...), but with that approach the child only starts executing my code after I close its stdin, which effectively shuts down the whole Python process.
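For reference, here is a minimal sketch of the Popen approach I mean, which exhibits the limitation: all output arrives in one batch only after stdin is closed, so there is no way to match output to individual code pieces.

```python
import subprocess
import sys

# Start a child Python interpreter reading code from stdin.
proc = subprocess.Popen(
    [sys.executable],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    text=True,
)

# communicate() writes the code, then CLOSES stdin -- only at that
# point does the child run anything, and all output comes back at once.
out, _ = proc.communicate('a = 42\nprint(a)\n')
print(repr(out))  # '42\n' -- one undifferentiated blob of output
```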
Upvotes: 0
Views: 857
Reputation: 4586
You can use contextlib.redirect_stdout from the standard library to capture the output of exec() calls. With that, your idea of code blocks (as I understand them) is straightforward to implement:
import io
from contextlib import redirect_stdout

class Block:
    def __init__(self, code=''):
        self.code = code
        self.stdout = io.StringIO()

    def run(self):
        with redirect_stdout(self.stdout):
            exec(self.code, globals())  # pass the global dict so later blocks see earlier variables

    @property
    def output(self):
        return self.stdout.getvalue()
>>> b1 = Block('a = 42; print(a)')
>>> b2 = Block('print(1/a)')
>>> b1.run()
>>> b2.run()
>>> b1.output
'42\n'
>>> b2.output
'0.023809523809523808\n'
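As a variation (my own extension, not part of the class above), you may prefer a dedicated namespace dict over globals(), so the executed blocks cannot clobber your own program's variables, and you can also capture stderr so that tracebacks from failing blocks are recorded instead of crashing the runner. A sketch:

```python
import io
import traceback
from contextlib import redirect_stdout, redirect_stderr

class Notebook:
    """Runs code pieces sequentially in one shared, isolated namespace."""

    def __init__(self):
        self.namespace = {}  # shared across blocks, plays the role of globals()

    def run(self, code):
        out, err = io.StringIO(), io.StringIO()
        with redirect_stdout(out), redirect_stderr(err):
            try:
                exec(code, self.namespace)
            except Exception:
                traceback.print_exc()  # goes into the redirected stderr
        return out.getvalue(), err.getvalue()

nb = Notebook()
stdout1, _ = nb.run('a = 42; print(a)')      # stdout1 == '42\n'
stdout2, _ = nb.run('print(1/a)')            # 'a' survives between blocks
_, stderr3 = nb.run('print(1/0)')            # traceback captured, runner keeps going
```

The try/except around exec() means one failing block does not take down the whole session, which matches how notebook cells behave.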
Upvotes: 1