cnmcferren

Reputation: 170

How do I run a Python script on all files in a directory?

import pyfits

filename = input('Enter file name as string: ')
newfile = filename.replace('.fits', '.dat')

f = pyfits.getdata(filename)
h = pyfits.getheader(filename)
g = open(newfile, 'w')

ref = h['JD_REF']

for x in range(len(f)):
    deltat = (f[x][0] - ref) / 86400
    refday = ref / 86400
    time = refday + deltat
    mag = f[x][3]
    err = f[x][4]
    flag = f[x][8]
    g.write('%9s %9s %10s %2s\n' % (time, mag, err, flag))

g.close()

I am trying to convert a .fits file containing a table of data into a .dat file. This program does what I want, but it only processes one file at a time. I have around 300 .fits files that need to be converted, and typing in each name by hand would be painful. I would like to be able to just run the program once and have it cycle through every file in my directory.

How can I modify this program to run on all of the files that are in the current directory?

Upvotes: 1

Views: 634

Answers (1)

bnaecker

Reputation: 6430

Use os.listdir(). If you only want the files that match the '.fits' pattern, you can use glob.glob('*.fits') instead.

This will return all files in the current directory, or whatever directory you supply it as an argument. You can loop over these names, and use those as your input files. Something like:

import os

for f in os.listdir():
    newfile = f.replace('.fits', '.dat')
    # all the rest of your code
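Putting that together with the glob approach, one way to structure it is to move the per-file code from the question into a function and loop over the matches. This is a sketch: the `convert` name is just illustrative, and the pyfits calls from the question would go where the comment indicates:

```python
import glob

def convert(fits_name):
    """Convert one .fits table to a .dat file.

    The body of the OP's script (pyfits.getdata, the write loop,
    etc.) goes where the comment is; here we just derive the
    output name so the batching logic is visible.
    """
    dat_name = fits_name.replace('.fits', '.dat')
    # ... per-file conversion code from the question ...
    return dat_name

# Process every .fits file in the current directory.
for fits_name in sorted(glob.glob('*.fits')):
    convert(fits_name)
```

`glob.glob` returns names in arbitrary order, so `sorted()` is used to make the run deterministic; it is optional if order does not matter.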

Upvotes: 2
