Reputation: 10030
I'd like to know what the size limitations are if I upload a list of files in one client form submission using the HTTP multipart content type.
On the server side I am using Spring's MultipartHttpServletRequest
to handle the request.
My questions:
Related threads that still don't have an exact answer to this:
Upvotes: 4
Views: 3622
Reputation: 8969
I have looked into the Spring CommonsMultipartResolver code. It uses Apache Commons FileUpload to parse the multipart data. CommonsMultipartResolver has a default in-memory threshold of 10 kB and uses DiskFileItemFactory to create each FileItem. If the stream is larger than 10 kB, the FileItem gets written to disk. You can also change the location of the repository folder used to store the temporary uploaded file items.
Should there be a separate per-file size limitation and a total request size limitation, or is file size the only limitation, so that the request can carry hundreds of files as long as none of them is too large?
I don't know what the maximum number of parts in a multipart request is, or what limit Spring's CommonsMultipartResolver imposes, but because it uses Apache FileUpload, I assume the limits are the same. I have used Apache FileUpload (without Spring) to receive a multipart stream bigger than 3 GB, consisting of 2 files of 20 MB and 3.6 GB. Although I cannot give you the exact limit, it should be able to handle a few GB of stream.
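For reference, plain Apache Commons FileUpload does let you set the per-file limit and the total-request limit separately. A rough sketch (assuming a reasonably recent commons-fileupload version; the limit values and the /tmp/uploads path are made up):

```java
import java.io.File;
import java.util.List;

import javax.servlet.http.HttpServletRequest;

import org.apache.commons.fileupload.FileItem;
import org.apache.commons.fileupload.disk.DiskFileItemFactory;
import org.apache.commons.fileupload.servlet.ServletFileUpload;

public class UploadLimitsExample {

    // Parses a multipart request, showing that the per-file limit and the
    // total-request limit are independent settings.
    public List<FileItem> parse(HttpServletRequest request) throws Exception {
        DiskFileItemFactory factory = new DiskFileItemFactory();
        factory.setSizeThreshold(10 * 1024);             // keep items in memory up to 10 kB
        factory.setRepository(new File("/tmp/uploads")); // spill larger items to disk here

        ServletFileUpload upload = new ServletFileUpload(factory);
        upload.setFileSizeMax(500L * 1024 * 1024);       // limit for a single file
        upload.setSizeMax(2L * 1024 * 1024 * 1024);      // limit for the whole request (-1 = no limit)

        return upload.parseRequest(request);
    }
}
```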
Does the Spring request wrapper read the complete request and store it in the Java heap memory, or does it store it in temporary files so that it can handle a large quota?
As I explained earlier, CommonsMultipartResolver uses Apache FileUpload. The FileItem is created using DiskFileItemFactory, so the FileItem is saved temporarily to disk. The default memory threshold is 10 kB.
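So on the Spring side, a handler can move each part to its final location without loading the whole file into the heap, since parts above the threshold are already sitting in temp files on disk. A hedged sketch (the /upload mapping and the /data/uploads destination are invented for illustration):

```java
import java.io.File;

import org.springframework.stereotype.Controller;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RequestMethod;
import org.springframework.web.multipart.MultipartFile;
import org.springframework.web.multipart.MultipartHttpServletRequest;

@Controller
public class UploadController {

    // transferTo() copies/moves the already-parsed temp file to its destination,
    // so the file content does not have to fit in the Java heap.
    @RequestMapping(value = "/upload", method = RequestMethod.POST)
    public String handleUpload(MultipartHttpServletRequest request) throws Exception {
        for (MultipartFile file : request.getFileMap().values()) {
            file.transferTo(new File("/data/uploads", file.getOriginalFilename()));
        }
        return "uploadDone";
    }
}
```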
Would reading the HttpServletRequest as a stream change the size limitation, compared with the application server reading the complete HTTP request at once?
Sorry, I am not able to answer this. What is the difference between reading the HttpServletRequest as a stream and reading the HTTP request at once?
What is the bottleneck of this process: the Java heap size, the quota of the filesystem on which my web server runs, the maximum allowed BLOB size of the database in which I am going to save the files, or Spring's internal limitations?
In general, the speed bottlenecks can be ordered as follows: Network < File I/O < Memory.
The Java heap size can be changed before you start your servlet container (e.g. via the -Xmx JVM option); 2^64 bytes is the theoretical limit on a 64-bit JVM.
For the filesystem quota, it depends on the filesystem: e.g. FAT32 allows a maximum file size of 4 GB, NTFS 16 TB, ext3 16 GB-2 TB (depending on block size), etc.
According to IBM, the BLOB limit should be 2 GB.
I am not aware of any Spring internal limit, but Apache FileUpload should be able to deal with a few GB of stream without any problem.
Upvotes: 2