Somebody Someone

Reputation: 15

Possibility to process a list of commands with subprocess.Popen in Python 3.7

I want to write the content returned by a curl command, applied to multiple URLs, to one file. For example, I have

days = [f'from-{x+1}d-to-{x}' for x in range(5, 0, -1)]

urls = [f'https://example.com?{day}/data' for day in days]

command = [f'curl {url}' for url in urls]

command

['curl https://example.com?from-6d-to-5/data', 'curl https://example.com?from-5d-to-4/data', 'curl https://example.com?from-4d-to-3/data', 'curl https://example.com?from-3d-to-2/data', 'curl https://example.com?from-2d-to-1/data']

And I'm trying to write it all to just one file:

import subprocess

content = subprocess.Popen([x for x in command], shell=True, text=True, stdout=subprocess.PIPE).communicate()
file_name = open('file_1', 'a')
file_name.write(str(content))

but it looks like subprocess.Popen executes only the first curl command, since I can see only one output in the console:

  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100 12591    0 12591    0     0  20931      0 --:--:-- --:--:-- --:--:-- 20915

Is there a way to execute multiple commands with subprocess.Popen? I would expect the same number of console outputs as there are URLs to be curled.

Upvotes: 0

Views: 320

Answers (2)

Kris

Reputation: 8868

From the official documentation:

subprocess.Popen(args, bufsize=-1, executable=None, stdin=None, stdout=None, stderr=None, preexec_fn=None, close_fds=True, shell=False, cwd=None, env=None, universal_newlines=None, startupinfo=None, creationflags=0, restore_signals=True, start_new_session=False, pass_fds=(), *, encoding=None, errors=None, text=None)

Execute a child program in a new process. On POSIX, the class uses os.execvp()-like behavior to execute the child program. On Windows, the class uses the Windows CreateProcess() function. The arguments to Popen are as follows.

args should be a sequence of program arguments or else a single string or path-like object. By default, the program to execute is the first item in args if args is a sequence. If args is a string, the interpretation is platform-dependent and described below. See the shell and executable arguments for additional differences from the default behavior. Unless otherwise stated, it is recommended to pass args as a sequence.

Popen() creates only a single child process, so only the first command is executed; the remaining items are treated as extra arguments.
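
If you really do want a single Popen call, one option is to hand the shell one string that chains all the commands, since with shell=True a string argument is passed to the shell as-is. A minimal sketch, assuming the command list and the file name 'file_1' from the question:

import subprocess

# '; '.join(...) yields e.g. 'curl url1; curl url2; ...'; the shell runs the
# curls one after another and their combined stdout comes back in one pipe.
joined = '; '.join(command)
output = subprocess.Popen(joined, shell=True, text=True, stdout=subprocess.PIPE).communicate()[0]

with open('file_1', 'a') as fh:
    fh.write(output)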

As answered by @Maurice, you can use urllib to get the URL responses. If you still want to use subprocess for this purpose, the code needs changes along these lines:

responses = [subprocess.Popen(x.split(), text=True, stdout=subprocess.PIPE).communicate()[0] for x in command]

with open('file_1', 'a') as file_name:
    file_name.writelines(responses)

This might not be a good option if you have a lot of URLs to process.
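
With many URLs, one way to avoid running the downloads strictly one after another is to start every process first and only collect the output afterwards. A rough sketch, assuming the urls list from the question and curl's -s flag to silence the progress meter:

import subprocess

# Start all curls at once so the downloads overlap, then read each result.
procs = [subprocess.Popen(['curl', '-s', url], text=True, stdout=subprocess.PIPE) for url in urls]

with open('file_1', 'a') as fh:
    for proc in procs:
        out, _ = proc.communicate()
        fh.write(out)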

Upvotes: 1

Maurice Meyer

Reputation: 18106

You can use urllib directly within Python; there is usually no need for subprocess/curl:

import urllib.request

days = [f'from-{x+1}d-to-{x}' for x in range(5, 0, -1)]
urls = [f'https://example.com?{day}/data' for day in days]

for url in urls:
    with urllib.request.urlopen(url) as response:
        print(response.read())
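
If the goal is to append everything to a single file, as in the question, the loop can write instead of print. A small variation on the snippet above, assuming the responses are UTF-8 text and reusing the file name 'file_1' from the question:

with open('file_1', 'a') as fh:
    for url in urls:
        with urllib.request.urlopen(url) as response:
            # read() returns bytes; decode before writing to a text-mode file
            fh.write(response.read().decode('utf-8'))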

Upvotes: 1
