Homunculus Reticulli

Reputation: 68436

Passing python array to bash script (and passing bash variable to python function)

I have written a Python module containing functions that return arrays. From a bash script, I want to access the string arrays returned by the module so that I can iterate over their elements.

For example:

Python module (mymod)

def foo():
    return ('String', 'Tuple', 'From', 'Python' )

def foo1(numargs):
    return [x for x in range(numargs)]

Bash script

foo_array=( .... ) # obtain array from mymod.foo()
for i in "${foo_array[@]}"
do
    echo "$i"
done


foo1_array=( .... ) # obtain array from mymod.foo1(pass arg count from bash)
for j in "${foo1_array[@]}"
do
    echo "$j"
done

How can I implement this in bash?

version Info:

Python 2.6.5, bash 4.1.5

Upvotes: 7

Views: 14570

Answers (5)

Dagorodir

Reputation: 104

As well as Maria's method for obtaining output from python, you can use the argparse library to pass variables into python scripts from bash; the argparse tutorial and reference in the official Python documentation cover both python 3 and python 2.

An example python script command_line.py:

import argparse
import numpy as np

if __name__ == "__main__":

    parser = argparse.ArgumentParser()

    parser.add_argument('x', type=int)

    parser.add_argument('array')

    args = parser.parse_args()

    print(type(args.x))
    print(type(args.array))
    print(2 * args.x)

    str_array = args.array.split(',')
    print(args.x * np.array(str_array, dtype=int))

Then, from a terminal:

$ python3 command_line.py 2 0,1,2,3,4
# Output
<class 'int'>
<class 'str'>
4
[0 2 4 6 8]
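Going the other direction, a bash array can be collapsed into the comma-separated string this script expects before being passed as an argument. A minimal sketch (the array contents and the final python3 invocation are illustrative):

```shell
#!/bin/bash
# Join a bash array with commas so it can be passed as a single
# argument to a script like command_line.py above.
arr=(0 1 2 3 4)
joined=$(IFS=,; echo "${arr[*]}")   # "${arr[*]}" joins on the first char of IFS
echo "$joined"                      # 0,1,2,3,4
# e.g.: python3 command_line.py 2 "$joined"
```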

Upvotes: 0

Keshav Patil

Reputation: 1

This helps too. script.py:

a = ['String', 'Tuple', 'From', 'Python']

for item in a:
    print(item)

and then we make the following bash script pyth.sh

#!/bin/bash

python script.py > tempfile.txt
readarray -t a < tempfile.txt
rm tempfile.txt

for j in "${a[@]}"
do
      echo "$j"
done

bash pyth.sh
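The temporary file can be skipped entirely with bash's process substitution, which also lets readarray strip trailing newlines via -t. A sketch, with printf standing in for `python script.py`:

```shell
#!/bin/bash
# Read lines straight into a bash array; -t drops each trailing newline.
# (printf stands in for `python script.py` here.)
readarray -t a < <(printf 'String\nTuple\nFrom\nPython\n')

for j in "${a[@]}"
do
      echo "$j"
done
```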

Upvotes: 0

Yauhen Yakimovich

Reputation: 14211

In addition, you can tell the python process to read STDIN by passing "-", as in

echo "print 'test'" | python -

Now you can define multiline snippets of python code and pass them to a subshell:

FOO=$( python - <<PYTHON

def foo():
    return ('String', 'Tuple', 'From', 'Python')

print ' '.join(foo())

PYTHON
)

for x in $FOO
do
    echo "$x"
done

You can also use env and set to list environment and local variables in bash, and pass them to python by interpolating them inside double-quoted ".." strings.
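The same heredoc trick covers the question's foo1(numargs) case: because the PYTHON delimiter is unquoted, bash expands variables inside the snippet before python sees it. A sketch, assuming a python3 interpreter is on PATH (the snippet above uses python 2 print syntax):

```shell
#!/bin/bash
NUMARGS=5   # bash variable, interpolated into the python snippet below

FOO1=$( python3 - <<PYTHON
print(' '.join(str(x) for x in range($NUMARGS)))
PYTHON
)

for x in $FOO1
do
    echo "$x"
done
```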

Upvotes: 1

Maria Zverina

Reputation: 11173

Second try - this time shell takes the integration brunt.

Given foo.py containing this:

def foo():
        foo = ('String', 'Tuple', 'From', 'Python' )
        return foo

Then write your bash script as follows:

#!/bin/bash
FOO=`python -c 'from foo import *; print " ".join(foo())'`
for x in $FOO
do
        echo "This is foo.sh: $x"
done
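The -c one-liner also handles the question's foo1(numargs) half; a sketch using python3 syntax (the snippet above targets python 2), with an inline expression standing in for importing foo1 and the argument count interpolated by bash:

```shell
#!/bin/bash
count=4
FOO1=`python3 -c "print(' '.join(str(x) for x in range($count)))"`
for x in $FOO1
do
        echo "This is foo1.sh: $x"
done
```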

The remainder is first answer that drives integration from the Python end.

Python

import os
import subprocess

foo = ('String', 'Tuple', 'From', 'Python' )

os.environ['FOO'] = ' '.join(foo)  # os.environ also updates this process's own view, unlike os.putenv

subprocess.call('./foo.sh')

bash

#!/bin/bash
for x in $FOO
do
        echo "This is foo.sh: $x"
done

Upvotes: 14

Husman

Reputation: 6909

In lieu of something like object serialization, perhaps one way is to print a list of comma-separated values and pipe them in from the command line.

Then you can do something like:

> python script.py | sh shellscript.sh
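On the receiving end, the shell script can split those comma-separated values with IFS; a sketch in which a literal string stands in for the piped output of `python script.py`:

```shell
#!/bin/bash
# Split one comma-separated record into a bash array; the literal
# stands in for a line read from the python producer.
csv="String,Tuple,From,Python"
IFS=, read -r -a fields <<< "$csv"

for x in "${fields[@]}"
do
    echo "$x"
done
```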

Upvotes: 0
