jim jarnac

Reputation: 5152

Checking if script is already running (python / linux)

I am trying to add a function to a script that checks whether it is already running, because the script will be started by a cron job.

Here is a stub of what I attempted for that function:

import psutil
import sys
import time


print(__file__)


def check_if_running():
    # print('process_nb: ', len(list(psutil.process_iter())))
    for i, q in enumerate(psutil.process_iter()):
        n = q.name() 
        # print(i, n)
        if 'python' in n:
            print(i, n)
            c = q.cmdline() 
            print(c)
            if __file__ in c:
                print('already running')
                sys.exit()
            else:
                print('not yet running')
                return 


if __name__ == '__main__':
    check_if_running()
    while True:
        time.sleep(3)

I run the script a first time, then a second time in a separate shell. On the second run it should print 'already running' and exit; however, it doesn't.

Can anyone help me figure out why?

Upvotes: 1

Views: 1415

Answers (2)

alani

Reputation: 13079

Here is a possible alternative: this wrapper, based on Linux file locking, can be added at the start of the command in your cron job, so that no checks are needed inside your script itself.

Then in the crontab, just use this command:

/path/to/lock_wrapper --abort /path/to/lock_file your_command [your_command_args...] 
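For example, a crontab entry that runs a script every five minutes could look like this (the schedule, the lock-file path, and the script path are placeholders for illustration):

```
*/5 * * * * /path/to/lock_wrapper --abort /tmp/myscript.lock /usr/bin/python3 /path/to/myscript.py
```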

Ensure that the lockfile is on a local filesystem for proper file locking functionality. (Some types of shared filesystem do not work reliably with file locks.)

If the file is already locked, then it will abort. Without --abort, it would wait instead.

#!/usr/bin/env python3

"""
   a wrapper to run a command with a lock file so that if multiple
   commands are invoked with the same lockfile, they will only run one
   at a time, i.e. when it's running it applies an exclusive lock to the
   lockfile, and if another process already has the exclusive lock then
   it has to wait for the other instance to release the lock before it
   starts to run, or optionally the second process will simply abort

   can be used for running instances of commands that are
   resource-intensive or will in some other way conflict with each
   other
"""

import sys
import os
import fcntl
import subprocess
from argparse import ArgumentParser


def parse_args():
    parser = ArgumentParser(description=__doc__)
    parser.add_argument(
        "-a", "--abort",
        action="store_true",
        help="abort if the lockfile is already locked (rather than waiting)")
    parser.add_argument("lockfile",
                        help=("path name of lockfile "
                              "(will be created if it does not exist)"))
    parser.add_argument("command",
                        nargs="*",
                        help="command (with any arguments)")
    return parser.parse_args()


def ensure_exists(filename):
    if not os.path.exists(filename):
        with open(filename, "w"):
            pass


def lock(fh, wait=True):
    if wait:
        fcntl.flock(fh, fcntl.LOCK_EX)
    else:
        try:
            fcntl.flock(fh, fcntl.LOCK_EX | fcntl.LOCK_NB)
        except OSError:
            # another process already holds the lock
            sys.exit(1)


def unlock(fh):
    fcntl.flock(fh, fcntl.LOCK_UN)    


args = parse_args()
ensure_exists(args.lockfile)
with open(args.lockfile) as fh:
    lock(fh, wait=not args.abort)    
    with subprocess.Popen(args.command) as proc:
        return_code = proc.wait()
    unlock(fh)
sys.exit(return_code)
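The locking behaviour the wrapper relies on can be demonstrated in isolation. This is a minimal standalone sketch (not part of the wrapper) using only the standard library's `fcntl` module and a throwaway temporary file:

```python
import fcntl
import os
import tempfile

# Create a lock file and take an exclusive lock through a first handle.
path = os.path.join(tempfile.mkdtemp(), 'demo.lock')
first = open(path, 'w')
fcntl.flock(first, fcntl.LOCK_EX)

# A second handle attempting a non-blocking exclusive lock fails at once;
# this is the case that --abort turns into sys.exit(1) in the wrapper.
second = open(path, 'w')
try:
    fcntl.flock(second, fcntl.LOCK_EX | fcntl.LOCK_NB)
    status = 'lock acquired'
except OSError:
    # the first handle still holds the exclusive lock
    status = 'already locked'
print(status)

fcntl.flock(first, fcntl.LOCK_UN)
first.close()
second.close()
```

Without `LOCK_NB`, the second `flock` call would block until the first handle released the lock, which is the wrapper's default (waiting) behaviour.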

Upvotes: 0

khelwood

Reputation: 59146

As @JohnGordon noticed in the comments, there is a logic problem in your code.

if __file__ in c:
    print('already running')
    sys.exit()
else:
    print('not yet running')
    return

Here, if it checks a process and it doesn't match the file, the function returns. That means it won't check any remaining processes.

You can only deduce that the program is not yet running after the loop has been allowed to complete.

def check_if_running():
    # print('process_nb: ', len(list(psutil.process_iter())))
    for i, q in enumerate(psutil.process_iter()):
        n = q.name() 
        # print(i, n)
        if 'python' in n.lower():
            print(i, n)
            c = q.cmdline() 
            print(c)
            if __file__ in c:
                print('already running')
                sys.exit()
    # every process has been checked
    print('not yet running')

I also changed 'python' in n to 'python' in n.lower(), because on my system the process is called 'Python', not 'python', and this change should cover both cases.

However, when I tried this I found another problem: the program finds its own process and always shuts down, even if it's the only instance of itself running.

To avoid that, maybe you want to count the number of matching processes instead, and only exit if it finds more than one match.

def count_processes(name, file):
    return sum(name in q.name().lower() and file in q.cmdline()
               for q in psutil.process_iter())

def check_if_running():
    if count_processes('python', __file__) > 1:
        print('already running')
        sys.exit()
    else:
        print('not yet running')

Upvotes: 1
