sat_s

Reputation: 277

Modify python script to run on every file in a directory

So I have a Python script which takes a filename as a command-line argument and processes that file. However, because I have 263 files which all need the same processing, I was wondering whether the command-argument section could be modified with a for loop to run through all the files in a folder one after another? Cheers, Sat

EDIT:

The code for the system argument is here:

import getopt
import sys

try:
    # 'r:' takes the input filename; the rest are optional flags
    opt_list, args = getopt.getopt(sys.argv[1:], 'r:vo:A:Cp:U:eM:')

except getopt.GetoptError, msg:
    print 'prepare_receptor4.py: %s' % msg
    usage()
    sys.exit(2)

with '-r' giving the name of the file to be processed and the others being optional arguments. I'm not sure how to modify this with a for loop.

Upvotes: 9

Views: 31442

Answers (5)

johntellsall

Reputation: 15170

I suggest your 'main' process each file given after the options, that is, the paths in the "args" variable. Don't pass paths in with "-r"; that limits your flexibility. If you use os.walk() etc. inside the program, you're requiring it to work only on trees of files, which makes it harder to customize and develop.
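
For example, a minimal sketch of that structure (the process() function here is hypothetical, and the option string is the asker's with the 'r:' entry dropped):

import getopt
import sys

def process(path):
    # hypothetical per-file work; replace with the real processing
    print(path)

opt_list, args = getopt.getopt(sys.argv[1:], 'vo:A:Cp:U:eM:')

# every non-option argument is a path to process
for path in args:
    process(path)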

If the program works with a list of paths, it's very easy to use in different ways. For example, you can list one data file for testing. To process a directory do "myprogram dir/*.dat". To process a tree of files use backquotes:

myprogram `find . -name "*.dat"`

Lastly you can do very cheap parallel processing. Something like:

find . -name '*.dat' | xargs -P 5 myprogram

Up to five copies of your program run in parallel; no locking, forking, threading, or other synchronization is necessary.

(Above assumes you're on a Linux/OSX type system.)

Upvotes: 2

joel

Reputation: 527

When I am working on multiple files/folders, I usually use os.walk:

import os

for root, dirs, files in os.walk(directory):
    for fname in files:
        # fname is just the bare filename; see below for building the full path
        do_something(fname)

Get your directory from getopt or optparse. Also, you can build up the absolute path with os.path.abspath if you need it.

current_file = "%s%s%s" % (os.path.abspath(root), os.path.sep, fname)
do_something(current_file)
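
A minimal sketch of wiring that together with getopt (the -d option and do_something() are illustrative names, not part of the original script):

import getopt
import os
import sys

def do_something(path):
    # placeholder for the real per-file processing
    print(path)

opts, args = getopt.getopt(sys.argv[1:], 'd:')
directory = dict(opts).get('-d', '.')  # default to the current directory

for root, dirs, files in os.walk(directory):
    for fname in files:
        do_something(os.path.join(os.path.abspath(root), fname))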

Upvotes: 5

VMDX

Reputation: 685

os.walk() sounds like it might work here.

import os

def traverse_and_touch(directory, touch):
    '''
    General function for traversing a local directory. Walks through
    the entire directory tree and touches every file with the specified
    function.
    '''
    for root, dirs, files in os.walk(directory):
        for filename in files:
            touch(os.path.join(root, filename))

Now all you need to do is pass in the directory you'd like to traverse and a function, and it will run that function on every file.
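
For example, a hypothetical usage (process_file and the 'data' directory are just illustrative names):

def process_file(path):
    # replace with the real per-file processing
    print(path)

traverse_and_touch('data', process_file)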

os.walk() also traverses all sub-directories.

Upvotes: 4

David Z

Reputation: 131600

As a practical matter, whatever shell you're using probably has some syntax that can be easily used for this. In Bash, for example:

for f in *; do python myscript.py "$f"; done

To actually do this in Python, I'd suggest structuring your program so that the main code is in a function which takes one argument, the filename.

def process(filename):
    ...code goes here...

Then you can invoke this function like so:

for f in os.listdir(folder):
    process(os.path.join(folder, f))

folder could be passed as a command-line argument, or just written into the script (if it's not something you'd be reusing).
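
A minimal sketch of the command-line variant, assuming the directory is the first positional argument:

import os
import sys

def process(filename):
    # replace with the real per-file processing
    print(filename)

folder = sys.argv[1] if len(sys.argv) > 1 else '.'
for f in os.listdir(folder):
    process(os.path.join(folder, f))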

EDIT: In response to your edit, I'd suggest just giving the filenames as regular command-line arguments, without using the -r option, so that they'll wind up in args. Then you can do

for f in args:
    process(f)

or if you would rather pass the directory name as the command-line argument,

for d in args:
    for f in os.listdir(d):
        process(os.path.join(d, f))

Alternatively, I suppose you could pass multiple instances of the -r option, and then do

for opt, arg in opt_list:
    if opt == '-r':
        process(arg)

Upvotes: 15

Daenyth

Reputation: 37441

Yes, you could modify it like that. Loop through the arguments rather than indexing the first element.
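
A minimal sketch of that idea, assuming a hypothetical process() function:

import sys

def process(filename):
    # replace with the real per-file processing
    print(filename)

# loop over every argument instead of using only sys.argv[1]
for filename in sys.argv[1:]:
    process(filename)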

Upvotes: 1
