Reputation: 2860
I am working with Spring Boot 1.2.5 and I would like to upload a raw binary file to a controller. The file may be large, so I do not want to hold the whole request in memory but instead stream the file; in fact, the file is being generated as the transmission starts, so the client doesn't even know its size. I see an example of how to do something similar with a multipart-encoded file upload here. However, I do not want a multipart-encoded upload, just a raw stream of bytes. I can't seem to find a way to handle this use case in Spring.
Upvotes: 12
Views: 19012
Reputation: 13009
To upload large files that don't block your MVC request thread pool or use up more memory than you have in your JVM, you can use a combination of accepting the HttpServletRequest (or an InputStream) and then receiving it efficiently using NIO within a CompletableFuture.
Here's a sample controller to get you started. Before using this in a real scenario you will want to make sure that any filename that you write to is strongly validated before you do it.
import static java.nio.file.StandardOpenOption.CREATE;
import static java.nio.file.StandardOpenOption.TRUNCATE_EXISTING;
import static java.nio.file.StandardOpenOption.WRITE;
import static org.springframework.http.HttpStatus.INTERNAL_SERVER_ERROR;

import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.Channels;
import java.nio.channels.ReadableByteChannel;
import java.nio.channels.WritableByteChannel;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.concurrent.CompletableFuture;

import javax.servlet.http.HttpServletRequest;

import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.server.ResponseStatusException;

@RestController
public class TestService {

    @PostMapping("/upload/{filename:.+}")
    public CompletableFuture<ResponseEntity<?>> upload(HttpServletRequest request,
            @PathVariable("filename") String filename) {
        final int MAX_BUFFER_SIZE = 1024 * 128;
        // TODO: validate 'filename' to ensure it's legal and will be written where you want it
        // to be within the file system. Watch out for the many security gotchas.
        // asynchronously accept the upload
        return CompletableFuture.supplyAsync(() -> {
            try {
                // TODO: change this to where you want the file to be written
                Path file = Paths.get(filename);
                try (ReadableByteChannel inChannel = Channels.newChannel(request.getInputStream());
                     WritableByteChannel outChannel = Files.newByteChannel(file, CREATE, TRUNCATE_EXISTING, WRITE)) {
                    // no way to free a ByteBuffer manually - GC does it
                    ByteBuffer buffer = ByteBuffer.allocateDirect(MAX_BUFFER_SIZE);
                    while (inChannel.read(buffer) != -1) {
                        buffer.flip();
                        outChannel.write(buffer);
                        buffer.compact();
                    }
                    // EOF leaves the buffer in fill state; flip it and write anything remaining
                    buffer.flip();
                    while (buffer.hasRemaining()) {
                        outChannel.write(buffer);
                    }
                }
            } catch (IOException ex) {
                // TODO: log the exception because Spring doesn't seem to do that
                throw new ResponseStatusException(INTERNAL_SERVER_ERROR, "Failed to upload the file", ex);
            }
            // upload completed successfully
            return ResponseEntity.ok().build();
        });
    }
}
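If you want to see the read/flip/write/compact loop in isolation, here is a minimal self-contained sketch of the same copy pattern outside of Spring. The class and its `copy` helper are names made up for this example:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
import java.nio.channels.Channels;
import java.nio.channels.ReadableByteChannel;
import java.nio.channels.WritableByteChannel;
import java.util.Arrays;

public class ChannelCopyDemo {

    // Same read/flip/write/compact loop as in the controller above,
    // extracted so it can be exercised without a servlet container.
    static void copy(ReadableByteChannel in, WritableByteChannel out, int bufferSize) throws IOException {
        ByteBuffer buffer = ByteBuffer.allocateDirect(bufferSize);
        while (in.read(buffer) != -1) {
            buffer.flip();      // switch to drain mode
            out.write(buffer);  // may write only part of the buffer
            buffer.compact();   // keep any unwritten bytes, back to fill mode
        }
        // EOF leaves the buffer in fill mode; flip and drain the remainder
        buffer.flip();
        while (buffer.hasRemaining()) {
            out.write(buffer);
        }
    }

    public static void main(String[] args) throws Exception {
        // payload larger than the buffer to force several loop iterations
        byte[] payload = new byte[300_000];
        for (int i = 0; i < payload.length; i++) payload[i] = (byte) i;

        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        copy(Channels.newChannel(new ByteArrayInputStream(payload)),
             Channels.newChannel(sink), 1024 * 128);

        System.out.println(Arrays.equals(payload, sink.toByteArray())); // true
    }
}
```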
Upvotes: 2
Reputation: 519
I want to share some small discoveries that might help someone.
I was using Spring's MultipartFile to upload large files and was concerned that Spring would store the contents in memory. So I decided to use the getInputStream() method, hoping that this would stream the file directly to the desired location:
@PostMapping("/upload")
public ResponseEntity<?> uploadFile(@RequestPart MultipartFile file) throws FileNotFoundException, IOException {
    FileCopyUtils.copy(file.getInputStream(),
            new FileOutputStream(new File("/storage/upload/", file.getOriginalFilename())));
    return ResponseEntity.ok("Saved");
}
When I tested the controller with a 2 GB file, it took a long time to even hit the controller method. Debugging showed that Spring/Tomcat first stores the file in a temporary folder before handing it to the controller. This means that when you call getInputStream(), it returns a FileInputStream pointing to the file stored on the filesystem, instead of streaming directly from the client browser.
In other words, calling FileCopyUtils.copy() is slow because it copies the entire file to another location and then deletes the temporary file, making the request take twice as long to complete.
I investigated and discovered that you can disable the Spring features and handle the multipart request manually, but that is somewhat complicated and error-prone. Digging a little more, I found out that MultipartFile has a method called transferTo that actually moves the temporary file to the desired location. I tested it and it was instantaneous. My code ended up like this:
@PostMapping("/upload")
public ResponseEntity<?> uploadFile(@RequestPart MultipartFile file) throws FileNotFoundException, IOException {
    file.transferTo(new File("/storage/upload/", file.getOriginalFilename()));
    return ResponseEntity.ok("Saved");
}
In conclusion, if all you want is to upload the file to a specific directory/file, you can just use this solution and it will be as fast as streaming the file manually.
IMPORTANT: there are two transferTo() methods, one that receives a Path and another that receives a File. Don't use the one that receives a Path, because it will copy the file and be slow.
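Both answers write to a location derived from client input (the path variable in the first answer, getOriginalFilename() here), so the filename validation they warn about matters. Here is a minimal sketch of one way to do it; safeResolve is a hypothetical helper, not a Spring API:

```java
import java.nio.file.Path;
import java.nio.file.Paths;

public class FilenameCheck {

    // Hypothetical helper: resolve the client-supplied name against the upload
    // directory and reject anything that escapes it (e.g. "../../etc/passwd").
    static Path safeResolve(Path baseDir, String clientFilename) {
        Path resolved = baseDir.resolve(clientFilename).normalize();
        if (!resolved.startsWith(baseDir.normalize())) {
            throw new IllegalArgumentException("Illegal filename: " + clientFilename);
        }
        return resolved;
    }

    public static void main(String[] args) {
        Path base = Paths.get("/storage/upload");

        // a plain name resolves inside the upload directory
        System.out.println(safeResolve(base, "report.pdf"));

        // a traversal attempt normalizes outside the base directory and is rejected
        boolean rejected = false;
        try {
            safeResolve(base, "../../etc/passwd");
        } catch (IllegalArgumentException e) {
            rejected = true;
        }
        System.out.println(rejected); // true
    }
}
```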
EDIT1:
I tested the solution using the HttpServletRequest, but it will still store a temporary file unless you set the Spring config spring.servlet.multipart.enabled = false. The same occurs for solutions using MultipartHttpServletRequest.
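Disabling Spring's built-in multipart handling is a one-line property. Note that the property name has varied across Spring Boot versions, so check the docs for the line you run; the names below are my best understanding:

```properties
# Spring Boot 2.x
spring.servlet.multipart.enabled=false

# On older Boot lines the property name differs (e.g. spring.http.multipart.enabled
# on 1.4.x); consult the common application properties reference for your version.
```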
I see three main benefits using the solution I found:
- You don't have to handle the multipart request manually; you just add a @RequestPart MultipartFile parameter to your controller method
- You can receive other parts (including validated POJOs) alongside multiple files, e.g. public ResponseEntity<?> uploadFile(@RequestPart @Valid MyCustomPOJO pojo, @RequestPart MultipartFile file1, @RequestPart MultipartFile file2, @RequestPart MultipartFile file3)
Here is the URL for a test project that I've created to test some concepts, including this one:
https://github.com/noschang/SpringTester
Upvotes: 14
Reputation: 8310
You can just consume the HttpServletRequest input stream.
Just be aware that if you have any filters that pre-process the request and consume the input stream, then this might not work.
@ResponseBody
@RequestMapping(path = "fileupload", method = RequestMethod.POST, consumes = MediaType.APPLICATION_OCTET_STREAM_VALUE)
public void fileUpload(HttpServletRequest request) throws IOException {
    Files.copy(request.getInputStream(), Paths.get("myfilename"));
}
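One gotcha with this approach: Files.copy(InputStream, Path) throws FileAlreadyExistsException if the target file already exists, unless you pass StandardCopyOption.REPLACE_EXISTING, which matters for repeated uploads to the same name. A small self-contained sketch (CopyDemo is a made-up name, and a byte array stands in for the request stream):

```java
import java.io.ByteArrayInputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class CopyDemo {
    public static void main(String[] args) throws Exception {
        // createTempFile creates the file, so a plain Files.copy to it would
        // throw FileAlreadyExistsException without REPLACE_EXISTING
        Path target = Files.createTempFile("upload", ".bin");
        byte[] body = "raw request body".getBytes("UTF-8");

        Files.copy(new ByteArrayInputStream(body), target, StandardCopyOption.REPLACE_EXISTING);

        System.out.println(Files.size(target)); // 16
        Files.deleteIfExists(target);
    }
}
```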
Upvotes: 13