Gufis

Reputation: 21

Create and use multiple variables in a loop in Python

I'm trying to write a loop in Python that creates variables and also uses both already-created and not-yet-created variables to produce a system of linear equations. This is what I've got so far:

P = [[0, 0, 0, 0.5, 0, 0.5],
     [0.1, 0.1, 0, 0.4, 0, 0.4],
     [0, 0.2, 0.2, 0.3, 0, 0.3],
     [0, 0, 0.3, 0.5, 0, 0.2],
     [0, 0, 0, 0.4, 0.6, 0],
     [0, 0, 0, 0, 0.4, 0.6]]

for j in range(6):
    globals()['x%s' % j] = 0
    for i in range(6):
        globals()['x%s' % j] += P[i][j]*globals()['x%s' % i]

but I get a KeyError on the last line. I guess it's because the second variable (globals()['x%s' % i]) is not defined yet, but I don't know how to use variables on the right-hand side of the equals sign when they are only going to be defined later.

Upvotes: 1

Views: 283

Answers (1)

Erik Kaplun

Reputation: 38257

First of all, do not use globals() for this (or, in fact, for almost anything); create your own dictionary to store your dynamically created values:

vars = {}
vars['blabla'] = 123
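
The keys can be built dynamically, exactly like the variable names you were constructing for globals(); a quick illustration of the pattern (arbitrary values, just to show the mechanics):

key = 'x%s' % 3   # builds the string 'x3'
vars[key] = 0.0   # create the "variable"
vars[key] += 1.5  # and update it like any other value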

The reason you are getting the KeyError, though, is that you're trying to read a variable from globals() that doesn't exist yet (the same thing would happen with a custom dict). This line:

globals()['x%s' % j] += P[i][j]*globals()['x%s' % i]

is actually short for:

globals()['x%s' % j] = globals()['x%s' % j] + P[i][j]*globals()['x%s' % i]

but while globals()['x%s' % j] was just set to 0 one line up, globals()['x%s' % i] is not defined yet for any i > j, because the outer loop hasn't created it. So the right-hand side tries to read a value that doesn't exist.

Instead, you need to make sure the value you read on the right-hand side already exists before you do the += operation, for example:

if 'x%s' % i not in globals():
    globals()['x%s' % i] = 0
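
Dropped into your loop, that would look roughly like this (still using globals(), only to show where the check goes):

for j in range(6):
    globals()['x%s' % j] = 0
    for i in range(6):
        if 'x%s' % i not in globals():
            globals()['x%s' % i] = 0  # give the not-yet-created variable a default of 0
        globals()['x%s' % j] += P[i][j] * globals()['x%s' % i]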

BUT, if you do it properly, using a dict and a few other enhancements, it would look like this:

# it's nicer to use floats uniformly, not a mix of ints and floats;
# it's also safer because in Python 2.x, an int divided by another int is floored down
P = [[0.0, 0.0, 0.0, 0.5, 0.0, 0.5],
     [0.1, 0.1, 0.0, 0.4, 0.0, 0.4],
     [0.0, 0.2, 0.2, 0.3, 0.0, 0.3],
     [0.0, 0.0, 0.3, 0.5, 0.0, 0.2],
     [0.0, 0.0, 0.0, 0.4, 0.6, 0.0],
     [0.0, 0.0, 0.0, 0.0, 0.4, 0.6]]

vars = {}

for j in range(len(P)):
    vars['x%s' % j] = 0.0
    for i in range(len(P[0])):
        vars.setdefault('x%s' % i, 0.0)  # shorthand for the `if` check described above
        vars['x%s' % j] += P[i][j] * vars['x%s' % i]
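
The computed values can then be read back the same way, for example:

print(vars['x3'])  # a single value
for name, value in sorted(vars.items()):
    print('%s = %s' % (name, value))  # all of them, in order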

Also, since vars ends up containing nothing but the keys x0 to x5, I would rather use a list instead:

X = [0.0] * len(P)  # creates [0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
for j in range(len(P)):
    for i in range(len(P[0])):
        X[j] += P[i][j] * X[i]

This way, you don't even need the setdefault call.


Keep in mind, though, that I'm not familiar with the mathematical algorithm/computation you're trying to implement, so I'm also not able to spot any non-technical bugs or shortcomings in your code.

Upvotes: 1
