Reputation: 27969
I use argparse in Python to parse arguments from the command line:
def main():
    parser = argparse.ArgumentParser(usage=usage)
    parser.add_argument('-v', '--verbose', dest='verbose', action='store_true')
    parser.add_argument(...)
    args = parser.parse_args()
I use the args object only in a few places in the code.
There are three methods and the call stack looks like this:
def first_level(args):
    second_level()

def second_level():
    third_level()

def third_level():
    ### here I want to add some logging if args.verbose is True
I want to add some logging to third_level(). I don't want to change the signature of the method second_level(). How can I make the args object available in third_level()? I could store args as a global variable, but I was told not to use global variables in a developer training some years ago ...
What is the common way to handle this?
Upvotes: 1
Views: 127
Reputation: 231510
A common module structure is something like:
imports ...
<constants>
options = {'verbose': 0}   # etc.
# alt: options = argparse.Namespace(logging=False, ....)

def levelone(args, **kwargs):
    ....

def leveltwo(...):
    ....

def levelthree(...):
    <use constant>
    <use options>

def parser():
    p = argparse.ArgumentParser()
    ....
    args = p.parse_args()  # this uses sys.argv
    return args

if __name__ == '__main__':
    args = parser()
    options.update(vars(args))
    levelone(args)
The body of the module has function definitions, and can be imported by another module. If used as a script, parser reads the command line. That global options dict is available for all sorts of state-like parameters. In a sense they are constants that the user, or an importing module, can tweak. Values imported from a config file can play the same role.
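Fleshing out the sketch above into something runnable might look like this; the function names mirror the question, and parse_args taking an optional argv list is my own addition for testability:

```python
import argparse

# Module-level defaults; an importing module can tweak these directly.
options = {'verbose': False}

def third_level():
    # Reads the module-level dict instead of taking args as a parameter.
    if options['verbose']:
        print('third_level: verbose logging enabled')

def second_level():
    third_level()

def first_level():
    second_level()

def parse_args(argv=None):
    p = argparse.ArgumentParser()
    p.add_argument('-v', '--verbose', action='store_true')
    return p.parse_args(argv)   # argv=None falls back to sys.argv

if __name__ == '__main__':
    args = parse_args()
    options.update(vars(args))   # merge command-line values into the defaults
    first_level()
```

The key line is `options.update(vars(args))`: it happens once, at the top of the script, and every function below sees the result without any signature changes.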
Another common pattern is to make your functions methods of a class, and pass args
as object attributes.
class Foo():
    def __init__(self, logging=False):
        self.logging = logging
    def levelone(self):
        self.leveltwo()
    def leveltwo(self):
        <use self.logging>

Foo(args.logging).levelone()
While globals are discouraged, it's more because they get overused and spoil the modularity that functions provide. But Python also provides a module-level namespace that can contain more than just functions and classes. And any function defined in the module can access that namespace - unless its own definitions shadow it.
var1 = 'module level variable'
var2 = 'another'

def foo(var3):
    x = var1  # read/use the module level var1
    var2 = 1  # creates a local var2, shadowing the module level definition
    ...
================
I'm not sure whether I should recommend this, but you could parse sys.argv within third_level itself.
def third_level():
    import argparse
    p = argparse.ArgumentParser()
    p.add_argument('-v', '--verbose', action='count')
    args, _unknown = p.parse_known_args()  # returns (namespace, leftover args)
    verbose = args.verbose
    <logging>
argparse imports sys and uses sys.argv. It can do that regardless of whether it is used at your script level, in your main, or in some nested function. logging does the same sort of thing. You could use your own imported module to covertly pass values into functions. Obviously that can be abused. A class with class attributes can also be used this way.
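The class-attribute variant mentioned above could be sketched like this; the Config name and its attribute are made up for illustration:

```python
class Config:
    # Class attributes act as process-wide shared state; every module
    # that imports Config sees the same values, no instance needed.
    verbose = False

def third_level():
    # No args parameter required; the function reads the class attribute.
    if Config.verbose:
        print('verbose output from third_level')

def main():
    Config.verbose = True   # set once at startup, visible everywhere
    third_level()

main()
```

This is really a global in disguise, so the same caveats apply: it is convenient for a handful of settings fixed at startup, and confusing if mutated from many places.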
Upvotes: 2
Reputation: 44444
Converting my comment to an answer. I'd suggest not putting the condition in your third_level(..) at all. There are mechanisms to let the logging module take care of that -- and those mechanisms can be controlled from outside of those three functions.
Something like:
import logging

def first_level(args):
    second_level()

def second_level():
    third_level()

def third_level():
    logging.info("log line which will be printed if logging is at INFO level")

def main():
    args = ....
    # Set the logging level, conditionally
    if args.verbose:
        logging.basicConfig(filename='myapp.log', level=logging.INFO)
    else:
        logging.basicConfig(filename='myapp.log', level=logging.WARNING)
    first_level(args)
Upvotes: 2