I have a simple Python script which live-prints the output of commands. Here is the script:
#!/usr/bin/env python3
import subprocess

def RunSubProcess(process, process_args):
    process = subprocess.Popen([process, process_args], stdout=subprocess.PIPE, stderr=subprocess.PIPE, universal_newlines=True)
    while process.stdout.readable():
        line = process.stdout.readline()
        if not line:
            break
        print(line.strip())

# Below works, since it is a simple command with one argument
RunSubProcess('ls', '-la')
# But below does not work, because of the multiple args and double quotes involved
RunSubProcess('grep', '-r "Some_String" .')
Question:
As you can see, I have a reusable method called RunSubProcess which takes the command as the first parameter and the arguments to the command as the second parameter. RunSubProcess works well for ls -la, which is a simple command, but it fails when I want to call grep -r "Some_String" . (the trailing dot being the current directory). I believe subprocess.Popen is failing because there is more than one argument involved in the command, and especially because of the double quotes. How can I pass process_args properly, ensuring the double quotes and multiple arguments are taken care of?
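To see what actually goes wrong: the second parameter arrives at grep as one single token, quotes and all. A minimal sketch using the standard shlex module (not part of the original script) shows how a shell-style string tokenizes into the list that Popen expects:

```python
import shlex

# The failing call passes grep ONE combined argument: '-r "Some_String" .'
# shlex.split tokenizes the string the way a POSIX shell would:
args = shlex.split('-r "Some_String" .')
print(args)  # ['-r', 'Some_String', '.']
```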
Environment:
I am running this Python script in a bash shell on macOS with Python 3.9.0 installed.
Upvotes: 1
Views: 1616
Answer:
Your function definition is weird. A more sensible design would be to accept a list, just like subprocess.Popen does, and simply pass it on verbatim.
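A minimal sketch of that design, assuming the name run_sub_process (not from the answer), while keeping the original script's line-by-line printing:

```python
import subprocess

def run_sub_process(args):
    # Accept the complete argument list, exactly as subprocess.Popen expects it.
    process = subprocess.Popen(
        args, stdout=subprocess.PIPE, stderr=subprocess.PIPE, text=True
    )
    # Iterating over the pipe yields output line by line as it arrives.
    for line in process.stdout:
        print(line.strip())
    process.wait()

# Each argument is its own list element; no shell quoting is needed.
run_sub_process(['ls', '-la'])
run_sub_process(['grep', '-r', 'Some_String', '.'])
```

Because every argument is a separate list element, the double quotes from the shell command line simply disappear: they were only ever shell syntax, not part of the string grep should see.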
Tangentially, this looks like you should simply use subprocess.run()
instead. If you don't particularly care about the output, just run the process and let it write directly to standard output, without passing through Python at all.
subprocess.run(['ls', '-la'])
subprocess.run(['grep', '-r', 'Some_String', '.'])
If you do want the output to be visible to Python, use print(subprocess.check_output(['ls', '-la'], text=True))
or something along those lines.
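For example (a sketch; check_output is the convenience wrapper from the standard subprocess module):

```python
import subprocess

# check_output runs the command, waits for it to finish, and returns
# everything it wrote to stdout; text=True decodes the bytes to str.
listing = subprocess.check_output(['ls', '-la'], text=True)
print(listing)
```

Note that this buffers the whole output rather than streaming it live, and it raises CalledProcessError if the command exits with a non-zero status.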
Upvotes: 2