Reputation: 1325
I made a management command that populates one of my models from a csv file.
I need to do this update quite frequently and the csv files have tens of thousands of lines.
Sometimes it can take over 10 minutes to finish populating.
I want to add a feature that lets me upload the CSV file directly through the website; after the file is uploaded, Django should run that command (or at least the logic from it) and populate the db.
How would I go about this? I want to be able to leave the page after I upload the file and receive an e-mail when the task is finished.
Upvotes: 2
Views: 793
Reputation: 10116
You can do the same with django-background-tasks. It's a database-backed work queue for Django, and it's simpler to set up than Celery.
from background_task import background

@background(schedule=60)
def your_task():
    # do your cool work here
    ...
This converts your_task into a background task function: when you call it from regular code, it creates a Task object and stores it in the database, and the process_tasks management command picks it up and runs it later.
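For the CSV use case in the question, the task body is usually the same import logic as the management command, chunked so each batch can feed one bulk_create call instead of tens of thousands of single inserts. A minimal, framework-free sketch of that chunking (the batched_rows helper and the commented model/email names are hypothetical, not from the question):

```python
import csv
import io
from itertools import islice

def batched_rows(csv_text, batch_size=500):
    """Yield lists of row dicts so the task can insert in chunks."""
    reader = csv.DictReader(io.StringIO(csv_text))
    while True:
        batch = list(islice(reader, batch_size))
        if not batch:
            break
        yield batch

# Inside the @background task you would then do something like:
#   for batch in batched_rows(text):
#       MyModel.objects.bulk_create(MyModel(**row) for row in batch)
#   send_mail("CSV import finished", "...", "noreply@example.com", [user_email])
```

Batching keeps memory bounded on large files and lets the database commit in reasonably sized transactions.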
Upvotes: 2
Reputation: 12983
Use Celery
Roughly, it may look like this:
app = Celery(<config stuff here>)

@app.task(name='my_task')
def my_task():
    do_stuff()

def my_view(*args, **kwargs):
    result = process_request()
    app.send_task('my_task')
You'll need to create the task, register it with celery (there is some autodiscover magic you can use), then run the task asynchronously from your django app.
In production, you may want to run the Celery worker as a daemon process (celery worker; older releases shipped this as celeryd).
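The name-based dispatch that send_task('my_task') relies on can be illustrated without a broker: the decorator registers the function under a string name, and send_task looks it up by that name, so the view never has to import the task module. A toy stand-in (this is a simplified sketch of the idea, not Celery's real implementation, which serializes the call onto a message broker instead of invoking it directly):

```python
# Toy registry mimicking Celery's @app.task(name=...) / app.send_task(name)
# pattern: tasks are registered and dispatched by string name.
class TinyApp:
    def __init__(self):
        self.tasks = {}

    def task(self, name):
        def register(fn):
            self.tasks[name] = fn
            return fn
        return register

    def send_task(self, name, args=()):
        # Real Celery would enqueue this on a broker; here we call directly.
        return self.tasks[name](*args)

app = TinyApp()

@app.task(name='import_csv')
def import_csv(path):
    return f"imported {path}"
```

With real Celery, app.send_task('import_csv', args=('data.csv',)) from the view returns immediately while a worker process does the import, which is what lets the user leave the page.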
Upvotes: 1