msi_gerva

Reputation: 2078

How to make variable(s) global over different modules in Python?

I have a small project where I initially need to define and use a fairly large number of variables.

Obviously, I can make a configuration file where I set all the variable values. So far I have simply made a Python file that assigns the values:

value_a = 'something'
value_b = 'something'
value_c = 5.0

and called the file conf.py. When I do from conf import *, all the variables are initialized with their values.

However, the project has several modules, each with its own subroutines (methods), and I want the values from conf.py to be known in every method and every module.

Obviously, I can do from conf import * in every module and/or import conf in every subroutine, but is that the best way to initialize the variables?

Upvotes: 1

Views: 109

Answers (3)

Chris Johnson

Reputation: 21946

I generally agree with @Aaron. What he outlined is very general, portable, and safe.

Since import * is an antipattern, you could easily do import config and then reference its values like config.varname.

I think it's fine to use .py files when needed. Aaron's point is good, but as long as the config is controlled by the person running the app, there's no security issue. The main reason to allow .py files is when some of the config items need to be derived from other config items, or looked up / loaded at run time. If there's no need for that (config is 100% flat and static) then .json or another flat file approach as Aaron mentioned would be best.
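A minimal sketch of that pattern (the module name config and the values below are assumptions, not from the question):

```python
# config.py -- a plain module used as a namespace for settings
import os

value_a = 'something'
value_b = 'something'
value_c = 5.0

# A .py config can hold derived or runtime-looked-up values,
# which a flat format like JSON cannot express:
value_d = value_c * 2
data_dir = os.path.join(os.path.expanduser('~'), 'myapp')
```

Every other module then does import config and references config.value_a, config.value_d, and so on, keeping the names in one clearly labeled namespace.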

Upvotes: 1

In my opinion, a good solution is to create a conf.py, as you said, and define the global variables in it as dictionaries, so they stay organized and are easy to use in the modules that import them. For example:

globals = {'value_a': 'something a',
           'value_b': 'something b',
           'value_c': '5.0',
           'allowed_platforms': {
                          'windows': 'value 1',
                          'os x':    'value 2',
                          'linux':   'value 3'
                        },
           'hosts': ['host a', 'host b', 'host c'],
           ...
           }

You should avoid the from some_module import * statement, because it dumps many names into your namespace and is not explicit about what it imports. By doing from your_package.conf import globals at the top of each module, you can use the values without explicitly importing every single variable and without importing the entire module. I prefer that solution. It can be even better if you store the global variables in JSON files and then read and deserialize them in the conf.py module before the other modules import them.
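That JSON suggestion could be sketched like this (the file name conf.json, the variable name settings, and the keys are assumptions; for brevity the JSON here lives in a string rather than a file):

```python
# conf.py -- deserializes the global settings from JSON at import time
import json

# In practice you would read a file instead:
#     with open('conf.json') as f:
#         settings = json.load(f)
_raw = '''
{
  "value_a": "something a",
  "value_c": 5.0,
  "hosts": ["host a", "host b", "host c"]
}
'''
settings = json.loads(_raw)

# other modules then do:  from conf import settings
```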

Upvotes: 1

Aaron

Reputation: 11075

Using a module as you describe is a viable way to set up configuration values for a script, but there are a few reasons you might be better off with something else.

  • A few others in the comments have pointed out that import * is frowned upon because it clutters the root namespace with lots of variable names, making it much easier to create name conflicts by accident. Keeping the values under the module name (e.g. conf.varname) helps organizationally, both for keeping track of names and for preventing conflicts.

  • If you plan to distribute code that requires configuration, using a .py module opens your code to arbitrary code execution of anything typed into that file. This is where formats like .ini, .json, and .cfg are very useful. As an added bonus, using a common format (like JSON) makes the configuration easy to port to other languages if a colleague works in a different language but needs to share the same project. Off the top of my head, Python includes libraries for .xml, .json, and .ini files.
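As an illustration of the .ini route, the standard library's configparser reads such files; the section and key names below are made up, and read_string stands in for reading a real settings.ini:

```python
# Parse an .ini-style config with the standard library's configparser
import configparser

parser = configparser.ConfigParser()
# With a real file you would call: parser.read('settings.ini')
parser.read_string("""
[main]
value_a = something
value_c = 5.0
""")

value_a = parser['main']['value_a']          # values arrive as strings
value_c = parser.getfloat('main', 'value_c') # typed accessors convert them
```

Unlike a .py config, nothing in the file is executed, so a bad or malicious config cannot run code.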

Upvotes: 2
