keynesiancross

Reputation: 3529

Python: Event-Handler for background task complete

I have a _global_variable = Big_Giant_Class(). Big_Giant_Class takes a long time to run, but it also has constantly refreshing 'live data' behind it, so I always want as new an instance of it as possible. It's not IO-bound, just a load of CPU computations.

Further, my program has a number of functions that reference that global instance of Big_Giant_Class.

I'm trying to figure out a way to create Big_Giant_Class in an endless loop (so I always have the latest-and-greatest!), but without it blocking all the other functions that reference _global_variable.

Conceptually, I kind of figure the code would look like:

import asyncio
import time
class Big_Giant_Class():
    def __init__(self, val, sleep_me = False):
        self.val = val
        if sleep_me:
            time.sleep(10)
    
    def print_val(self):
        print(self.val)    

async def run_loop():    
    while True:
        new_instance_value = await asyncio.run(Big_Giant_Class(val = 1)) # <-- takes a while
        # somehow assign new_instance_value to _global_variable when it's done!        

def do_stuff_that_cant_be_blocked():
    global _global_variable    
    return _global_variable.print_val()

_global_variable = Big_Giant_Class(val = 0)

if __name__ == "__main__":
    asyncio.run(run_loop())  #<-- maybe I have to do this somewhere?
    
    for i in range(20):
        do_stuff_that_cant_be_blocked()
        time.sleep(1)        
        
Conceptual Out:
    0
    0
    0
    0
    0
    0
    0
    0
    0
    0
    0
    1
    1
    1
    1
    1
    1
    1
    1
    1
    1
    1

The kicker is, I have a number of functions [ie, do_stuff_that_cant_be_blocked] that can't be blocked.

I simply want them to use the last _global_variable value (which gets periodically updated by some unblocking...thing?). That's why I figure I can't await the results, because that would block the other functions?

Is it possible to do something like that? I've done very little asyncio, so apologies if this is basic. I'm open to any other packages that might be able to do this (although I don't think Trio works, because I have required packages that are incompatible with it).

Thanks for any help in advance!

Upvotes: 0

Views: 755

Answers (1)

CallMePhil

Reputation: 1657

So you have two CPU-bound "loops" in your program. Python has a quirky threading model: because of the GIL, pure-Python code CANNOT do two calculations at once. Threading and async only let Python fake doing two things at a time.

Threading lets you "do" two things because Python switches between the threads and makes progress on each, but it doesn't run both at the same time.

Async lets you "do" two things if you can await the operation. While Python awaits it, it can jump away and do something else. However, awaiting a CPU-bound operation will not let it jump away and do other things.

The easiest solution is to use a thread, though there will be times when both loops stall because work is being done on the other; the interpreter splits its time roughly 50/50 between the threads.

from threading import Thread

_global_variable = some_initial_value  # e.g. Big_Giant_Class(val=0)


def update_global():
    global _global_variable

    while True:
        # get_new_global_instance stands in for the slow Big_Giant_Class build
        _global_variable = get_new_global_instance()
        call_some_event()  # optional: notify listeners that the value changed


def main():
    background_thread = Thread(target=update_global, daemon=True)
    background_thread.start()

    while True:
        do_important_work()  # reads _global_variable freely, never blocked

The harder, but truly parallel, version would be to use a Process instead of a Thread, but you would also need shared state or a queue (or something like that) to hand results back: https://docs.python.org/3/library/multiprocessing.html#sharing-state-between-processes
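A minimal sketch of that Process variant, using the shared-state approach from the linked docs: the background process recomputes a value and stores it in a multiprocessing.Value, which the main process can read at any moment without blocking. expensive_compute is a hypothetical stand-in for rebuilding Big_Giant_Class; note that a full class instance cannot be shared this way (only picklable ctypes primitives), which is why a queue or a Manager is often needed in practice:

```python
import multiprocessing as mp


def expensive_compute(n):
    # placeholder for the CPU-bound work
    return n * n


def updater(shared_val, iterations):
    """Background process: keep refreshing the shared value."""
    for n in range(iterations):
        with shared_val.get_lock():
            shared_val.value = expensive_compute(n)


if __name__ == "__main__":
    latest = mp.Value("i", 0)  # shared int, initially 0
    worker = mp.Process(target=updater, args=(latest, 4))
    worker.start()
    # the main process could keep doing unblocked work here, reading
    # latest.value whenever it needs the most recent result
    worker.join()
    print(latest.value)  # → 9, the last value written (3 * 3)
```

Unlike the thread version, the updater here runs on its own core, so the main process's loop is never slowed by the recomputation.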

Upvotes: 1
