Reputation: 5283
On my Rails-based website, I'd like a user to be able to input a URL, which my server then downloads using the Unix command `wget` with the recursive option. My server then zips the downloaded files together with some files already on the server, and sends the zip to the user with the Rails method `send_file`.
Now I'm worried that `send_file` will execute before `wget` and the zipping have finished. How do I make sure that sending the file happens only after the downloading and zipping are done? Will this approach work, or should I take another approach?
Upvotes: 2
Views: 330
Reputation: 6062
You're better off offloading your download tasks to a worker thread (or service) and notifying the user when the download is ready.
If you wanted to be really elegant, you could have some javascript in your layout(/view/partial/etc) periodically check for completion and then notify the user/offer a download option.
You want to offload because handling the download inside the request is fragile and expensive: a long-running request can fail partway through, and it ties up an app-server process for its whole duration.
Offloading also lets you queue up requests, so you can cap the number of simultaneous download and processing tasks. If a download is initiated directly by user action, you had better hope your app server can handle the load when several/many people start requesting files all at once. Shelling out to your OS to do the work will make memory/resource usage skyrocket, since each process loads a separate shell into memory.
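As a minimal sketch of the offloading idea in plain Ruby (no job library; in a real app you would use something like Delayed::Job or Resque, and `DownloadWorker` here is a made-up name), a single worker thread drains a queue of jobs and records each job's status so a controller action, or the polling JavaScript mentioned above, can check for completion:

```ruby
# Toy background worker: one thread processes queued jobs one at a
# time and records each job's status (:pending, :done, :failed).
class DownloadWorker
  def initialize
    @queue  = Queue.new
    @status = {}
    @lock   = Mutex.new
    @seq    = 0
    @thread = Thread.new do
      loop do
        id, work = @queue.pop
        break if id == :stop
        result = begin
          work.call  # e.g. the wget -r download plus the zipping step
          :done
        rescue StandardError
          :failed
        end
        @lock.synchronize { @status[id] = result }
      end
    end
  end

  # Enqueue a unit of work; returns an id a status endpoint can poll.
  def enqueue(&work)
    id = @lock.synchronize { @seq += 1 }
    @lock.synchronize { @status[id] = :pending }
    @queue << [id, work]
    id
  end

  def status(id)
    @lock.synchronize { @status[id] }
  end

  def shutdown
    @queue << [:stop, nil]
    @thread.join
  end
end

worker = DownloadWorker.new
id = worker.enqueue { sleep 0.1 }  # stand-in for `wget -r ...` + zip
```

Once `worker.status(id)` returns `:done`, it is safe to offer the zip for download; the queue also naturally serializes the work, so a burst of user requests doesn't spawn a shell per request.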
Upvotes: 2
Reputation: 4133
How are you interacting with `wget` from within the application? If you're using Ruby's backticks or similar, it will wait until the command exits:
def some_action
  # some magic
  out = `wget ....`  # blocks until wget exits
  # zip
  send_file #...
end
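Since backticks block until the child process exits, you can also check the exit status before calling `send_file` to guard against a failed download: Ruby leaves the `Process::Status` of the last shelled-out command in `$?`. A small stand-alone illustration (using `echo` and `false` in place of `wget`):

```ruby
# Backticks return the command's stdout and block until it exits.
out = `echo hello`
out          # => "hello\n"
$?.success?  # => true, the command exited with status 0

`false`      # a command that exits nonzero
$?.success?  # => false, so you would bail out instead of sending the zip
```

In the action above, that means wrapping the `send_file` in a check like `if $?.success?` after the `wget` and zip steps.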
Upvotes: 2