Reputation: 137
I have a python script that runs on input files in the same directory as the script using sys.argv:
import sys

def main():
    input_1 = sys.argv[1]
    input_2 = sys.argv[2]
    output_file = sys.argv[3]
    ...
How can I get this script to run multiple times, once over each directory in a structure like this? I want it to run over each data folder, grabbing the type 1 and type 2 data for use in the script, x times where x is the number of data folders. Would I use bash, another Python script, something else? I've never scripted for another script, so anything helps. Thanks
>folder
    >my_python_script.py
    >data
        >data_1
            >type_1
                data_type_1.txt
            >type_2
                data_type_1.txt
        >data_2
            >type_1
                data_type_1.txt
            >type_2
                data_type_1.txt
EDIT: The script also writes the two input files to an output file. This output file is created in the same directory as the script, and the output name changes based on which directory the script is running on, so it doesn't keep overwriting the same output file. So if it runs on 3 directories, there will be 3 different outputs.
Using Windows
I run the script in cmd like so:
python script.py 'input_1.txt' 'input_2.txt' 'out.txt'
Upvotes: 0
Views: 462
Reputation: 40688
In bash, I would use the find command to look for all data_? directories, then loop over them and call the Python script:
SCRIPT=$PWD/my_python_script.py
for data_dir in $(find . -name 'data_?')
do
    cd "$data_dir"
    python "$SCRIPT"
    cd - > /dev/null  # Return to the top directory
done
The SCRIPT variable holds the absolute path of the script, so it will be found regardless of the current directory. I don't fully understand your question, but going by your tree, I am guessing that within each data folder you want to pass type_1/data_type_1.txt as the first parameter and type_2/data_type_1.txt as the second:
SCRIPT=$PWD/my_python_script.py
for data_dir in $(find . -name 'data_?')
do
    cd "$data_dir"
    python "$SCRIPT" type_1/data_type_1.txt type_2/data_type_1.txt
    cd - > /dev/null  # Return to the top directory
done
As for the third parameter, I cannot guess.
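Since you mentioned you're on Windows, where bash isn't available by default, a pure-Python driver script may be easier. This is a minimal sketch, assuming the folder layout from the question and that a per-folder output name like out_data_1.txt is acceptable (that naming scheme is my guess, not something you specified):

```python
import subprocess
import sys
from pathlib import Path

# Assumed to sit next to this driver script, per the tree in the question.
SCRIPT = Path("my_python_script.py")

def build_command(data_dir):
    """Build the argv list for one data_N folder, using the layout shown above."""
    input_1 = data_dir / "type_1" / "data_type_1.txt"
    input_2 = data_dir / "type_2" / "data_type_1.txt"
    output = f"out_{data_dir.name}.txt"  # one distinct output per data folder
    return [sys.executable, str(SCRIPT), str(input_1), str(input_2), output]

if __name__ == "__main__":
    # Iterate over data/data_1, data/data_2, ... and run the script on each.
    for data_dir in sorted(Path("data").glob("data_*")):
        subprocess.run(build_command(data_dir), check=True)
```

Run it from the folder containing my_python_script.py and data. Using sys.executable guarantees the child process runs under the same Python interpreter as the driver, and check=True stops the loop if any run fails.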
Upvotes: 1