Reputation: 7745
I am using Celery to load neural network models and would like to store the loaded model in settings for fast prediction.
So in django.conf.settings I have:

    MODELS = {}
and in a Celery task I have the following snippet:

    @app.task
    def load_nn_models(model_name):
        from django.conf import settings
        ...
        settings.MODELS[model_name] = {'model': net, 'graph': sess}
However, I noticed that the tasks run in a separate worker process with its own Django environment, so any changes made to the settings are not reflected back in the main process.
Is there a workaround for this?
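The behaviour described above can be demonstrated without Django or Celery at all (illustrative code, not the author's): each process gets its own copy of module-level state such as settings.MODELS, so a mutation made in a worker process never shows up in the parent.

```python
# Each Python process has its own copy of module-level state.
import subprocess
import sys

MODELS = {}  # stands in for settings.MODELS in the parent (web) process

# A child Python process "loads a model" into its own MODELS dict ...
child = "MODELS = {}; MODELS['net'] = 'loaded'; print(MODELS)"
out = subprocess.run([sys.executable, "-c", child],
                     capture_output=True, text=True).stdout.strip()

print(out)     # the child's copy: {'net': 'loaded'}
print(MODELS)  # ... but the parent's copy is untouched: {}
```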
EDIT
The parameters I am storing in settings are:
Upvotes: 0
Views: 284
Reputation: 77902
Django settings are not the right place for this, obviously. First, because the settings object is not a shared resource (there is one instance per process); second, because the documentation explicitly says this object is to be considered immutable.
If your point is to have a Celery task compute those objects so that other tasks and/or the frontend can use them, you will have to find a way to serialize them and store the serialized version in a shared resource (database, cache, etc.).
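A minimal sketch of that idea, with a local file standing in for the shared store and pickle standing in for whatever serializer actually fits the objects (a TensorFlow graph/session would need the framework's own save/load mechanism, not pickle; all names here are illustrative):

```python
# Producer/consumer around a shared store: the task serializes the computed
# object; any other process deserializes it on demand.
import os
import pickle
import tempfile

def store_model(store_path, model_name, payload):
    """Celery-task side: serialize the result into the shared store."""
    with open(store_path, "wb") as f:
        pickle.dump({model_name: payload}, f)

def load_models(store_path):
    """Consumer side (another task or the web process): read it back."""
    with open(store_path, "rb") as f:
        return pickle.load(f)

store = os.path.join(tempfile.gettempdir(), "models.pkl")
store_model(store, "net", {"weights": [0.1, 0.2]})
print(load_models(store))  # {'net': {'weights': [0.1, 0.2]}}
```

In practice the file would be replaced by Redis, memcached, or a database row, so every worker and web process sees the same data.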
Upvotes: 1
Reputation:
You can try to use configparser:

    import configparser

    def dict_from_file():
        config = configparser.ConfigParser()
        config.read("config.ini")
        models = config['models']
        for x in models.values():
            print(x)
Set up the file config.ini:

    [models]
    var_a: home
    var_b: car
    var_c: Next

Call dict_from_file; the output is:

    home
    car
    Next
Update the file config.ini:

    [models]
    var_a: home
    var_c: New

Call dict_from_file again; the output is:

    home
    New
You can read more in the docs about the supported datatypes.
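For example, configparser stores every value as a string, but its typed getters (getint, getfloat, getboolean) convert on read (the keys below are illustrative):

```python
# ConfigParser keeps raw strings; the typed getters parse them on access.
import configparser

config = configparser.ConfigParser()
config.read_string("""
[models]
batch_size: 32
threshold: 0.5
use_gpu: yes
""")

print(config.getint("models", "batch_size"))    # 32
print(config.getfloat("models", "threshold"))   # 0.5
print(config.getboolean("models", "use_gpu"))   # True
```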
Upvotes: 1