Reputation: 546
I need to run a large build script (bash commands) from a Python script. I receive it as one large string in which the lines are separated by \n, so I need to execute each line separately.
At first I tried subprocess.Popen() to execute each line. The problem is that after each line, the process terminates and all the environment variables are lost.
The problem is not waiting for one command to finish before executing another; I need all of them to be executed in the same shell.
The only solution I have found so far is to save all those commands to an sh file (for example build.sh) and execute it from Python.
I would rather not use this approach because I want more control over each execution.
Is there any way to execute those commands in the same process, one by one?
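Roughly what I am doing now (a sketch; the commands string is just a placeholder for the real build script):
import subprocess, tempfile

commands = 'export FOO=bar\necho $FOO'  # placeholder for the real build string
# Write everything to a temporary build.sh and run it in one shot
with tempfile.NamedTemporaryFile('w', suffix='.sh', delete=False) as f:
    f.write(commands)
subprocess.run(['bash', f.name], check=True)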
Any other solution would be nice too.
Upvotes: 3
Views: 3118
Reputation: 136
What you want is definitely a little weird, but it's possible using pipes.
from subprocess import PIPE, Popen

# text=True gives str I/O; bufsize=1 line-buffers stdin so each
# write reaches bash without an explicit flush
p = Popen(['bash'], stdin=PIPE, stdout=PIPE, text=True, bufsize=1)
p.stdin.write('echo hello world\n')
print(p.stdout.readline())
# Check a return code
p.stdin.write('echo $?\n')
if p.stdout.readline().strip() == '0':
    print("Command succeeded")
p.stdin.write('echo bye world\n')
# Close stdin and wait for bash to exit
stdout, stderr = p.communicate()
print(stdout)
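To drive this for your actual use case (one build string, executed line by line in the same shell), here is a rough sketch. The run_line helper and the __DONE__ sentinel are my own inventions for illustration, and it assumes each line is a complete command that does not itself read stdin:
from subprocess import PIPE, Popen

def run_line(p, line):
    # Hypothetical helper: run one command in the persistent shell,
    # then echo a sentinel carrying that command's exit status.
    p.stdin.write(line + '\n')
    p.stdin.write('echo __DONE__ $?\n')
    output = []
    while True:
        out = p.stdout.readline()
        if not out:
            raise RuntimeError('bash exited unexpectedly')
        if out.startswith('__DONE__'):
            return output, int(out.split()[1])
        output.append(out.rstrip('\n'))

p = Popen(['bash'], stdin=PIPE, stdout=PIPE, text=True, bufsize=1)
for line in 'export FOO=bar\necho $FOO'.split('\n'):
    output, status = run_line(p, line)
    print(repr(line), '->', status, output)
p.communicate()
Because every line goes to the same bash process, the exported FOO survives into the second command, which is exactly the behavior the per-line Popen approach loses.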
Upvotes: 4
Reputation: 372
When you call a shell, the OS starts a new process; there is no bash interpreter running inside Python itself.
The only way to do it in the same process is to simulate every step directly in Python.
The better way is to accept that limit, launch the external process yourself, and wait for the script to terminate in a controlled way. An example is here: python to wait for shell command to complete
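A minimal sketch of that approach, assuming the script arrives as one string (subprocess.run blocks until the child exits):
import subprocess

script = 'export FOO=bar\necho $FOO'  # placeholder for the real build string
# Run the whole script in a single bash process and wait for it;
# check=True raises CalledProcessError on a nonzero exit status.
result = subprocess.run(['bash', '-c', script],
                        capture_output=True, text=True, check=True)
print(result.stdout)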
Upvotes: 0