Reputation: 3807
I am working on an application where the user has the potential to download thousands of files in one request into a zip file. Obviously, this will not be practical for our server. What would be the best way to go about serving up thousands of files to users?
Right now, what I have been working on is having the jQuery fileDownload library request 100 files, then, in the success handler, calling fileDownload again for another 100 files, offset by 100. The problem with this is that the fileDownload library (or the server) waits about 20 seconds before the fileDownload fail callback is called.
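For reference, here is a rough sketch of that chunked approach using the jQuery fileDownload plugin's successCallback/failCallback and data options; the endpoint URL and the offset/limit parameters are placeholders, not the application's real API:

    // Request 100 files at a time; on success, ask for the next chunk.
    var CHUNK_SIZE = 100;

    function downloadChunk(offset) {
        $.fileDownload('/download/zip', {
            httpMethod: 'POST',
            data: { offset: offset, limit: CHUNK_SIZE },
            successCallback: function () {
                // Once this chunk's zip has started downloading, request the next 100 files.
                downloadChunk(offset + CHUNK_SIZE);
            },
            failCallback: function () {
                // Only fires after the plugin's timeout elapses, hence the ~20 second wait.
                console.error('Download failed at offset ' + offset);
            }
        });
    }

    downloadChunk(0);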
The other problem with this method is that it isn't practical for the client to receive hundreds of pop-up windows, each asking whether they want to download 100 files.
We also can't send back thousands of files in a single response because our server doesn't have, and won't have, that much memory.
Upvotes: 0
Views: 69
Reputation: 8424
This is purely opinion based on my experience, but here are two options I have seen in use:
Option 1:
Batch-process the files, compress them, then advise the user of the download location. This should be limited in number of files and total size, though, as it can burn through server resources. I don't recommend this if you have a large number of users.
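An illustrative sketch of this, assuming a Node.js backend and the "archiver" npm package; the paths and the notification step are placeholders:

    const fs = require('fs');
    const path = require('path');
    const archiver = require('archiver');

    function buildArchive(filePaths, outputPath, onDone) {
        // Stream the zip to disk so thousands of files never sit in memory at once.
        const output = fs.createWriteStream(outputPath);
        const archive = archiver('zip', { zlib: { level: 9 } });

        output.on('close', function () {
            // Archive finished: advise the user of the download location
            // (e-mail, in-app notification, etc.) instead of streaming it inline.
            onDone(null, outputPath);
        });
        archive.on('error', onDone);

        archive.pipe(output);
        filePaths.forEach(function (p) {
            archive.file(p, { name: path.basename(p) });
        });
        archive.finalize();
    }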
Option 2 (Best):
Batch-process the files into a compressed file, then either enable users to FTP into the location to obtain the files, or, if your users have an FTP location of their own, have the file transferred over to it. I can tell you this is definitely the most effective approach, and it is used by a number of corporations I have been involved with.
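A minimal sketch of the transfer step, assuming Node.js and the "basic-ftp" npm package; the host, credentials, and paths are placeholders for the user's FTP location:

    const ftp = require('basic-ftp');

    async function pushArchiveToUserFtp(localZipPath, remoteZipPath) {
        const client = new ftp.Client();
        try {
            await client.access({
                host: 'ftp.example.com',
                user: 'username',
                password: 'password',
                secure: true // use FTPS where the remote server supports it
            });
            // One transfer of the pre-built archive instead of thousands of
            // individual files going through the web application.
            await client.uploadFrom(localZipPath, remoteZipPath);
        } finally {
            client.close();
        }
    }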
Upvotes: 2