Reputation: 962
I have a Python file, say A.py, which depends on several other packages/folders/files (let's call them B.py, C.py, D.py, etc.), and these reside in a different location than A.py.
A.py contents:
import B, C, D
# other code
And I am calling A.py from another Python script, let's call it 1.py.
1.py contents:
import subprocess
child_process = subprocess.Popen("python A.py", shell=True)
And I run 1.py from the command line like this: python 1.py
My question is: how do I pass a series of dependencies (in this case B.py, C.py, D.py) to A.py in the child process so that it runs successfully?
I am using Python 2.7.
Upvotes: 0
Views: 1287
Reputation: 2479
You should correctly install the B, C and D files in the Python path.
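For example, a quick way to check which directories the interpreter actually searches is to print sys.path; B.py, C.py and D.py need to end up in one of them:
import sys
print(sys.path)  # the list of directories searched when A.py runs "import B, C, D"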
Alternatively, you can use the PYTHONPATH environment variable:
child_process = subprocess.Popen(['python', 'A.py'], env={'PYTHONPATH': '/path/to/the/directory'})
Where /path/to/the/directory is the path to the directory containing B, C and D.
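If B, C and D do not all live in the same directory, you can join several directories with os.pathsep (the paths below are just placeholders):
import os
import subprocess

# PYTHONPATH accepts multiple directories separated by os.pathsep (':' on Unix, ';' on Windows)
deps_path = os.pathsep.join(['/path/to/dir1', '/path/to/dir2'])
child_process = subprocess.Popen(['python', 'A.py'], env={'PYTHONPATH': deps_path})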
Depending on what A.py does, you may also need to pass more of the environment. This can be achieved by either:
setting the variable in the parent process and letting the child process inherit it:
import os
from subprocess import Popen

os.environ['PYTHONPATH'] = '/path/...'
Popen(['python', 'A.py'])  # by default the child inherits the parent's environment
or copy the parent environment:
new_env = dict(os.environ) # make a copy
new_env['PYTHONPATH'] = '/path/...'
Popen(['python', 'A.py'], env=new_env)
Note: using shell=True is a security hazard. It is also unnecessary and less efficient here; you should avoid it and pass the command line as a list of strings.
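For comparison, the two invocation styles side by side:
import subprocess

# shell form: the whole string is handed to the shell, which parses it itself
subprocess.Popen("python A.py", shell=True)

# list form: the arguments go straight to the new process, no shell involved
subprocess.Popen(["python", "A.py"])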
Upvotes: 2