Reputation: 85
I need to send a tar.gz file from one Java app (via a Servlet) to another, and I'm using Apache HttpClient with a MultipartEntity to achieve this.
During the file transfer, the file seems to double in size, as if it's being decompressed, and it's no longer recognizable as either a tar.gz or a tar file.
Here's the send method:
HttpClient http = new DefaultHttpClient();
HttpPost post = new HttpPost(url);
MultipartEntity multipart = new MultipartEntity();
ContentBody fileContent = new FileBody(file, "application/octet-stream");
ContentBody pathContent = new StringBody(file.getAbsolutePath());
multipart.addPart("package", fileContent);
multipart.addPart("path", pathContent);
post.setEntity(multipart);
HttpResponse response = null;
try {
    response = http.execute(post);
    StringWriter sw = new StringWriter();
    IOUtils.copy(response.getEntity().getContent(), sw);
} catch (Exception ex) {
    log.error("Unable to POST to [" + url + "].", ex);
}
return result;
Here is the servlet method the above code is POSTing to:
@Override
protected void doPost(HttpServletRequest req, HttpServletResponse resp) throws ServletException, IOException {
    log.info("File transfer request received, collecting file information and saving to server.");
    Part filePart = req.getPart("package");
    Part filePathPart = req.getPart("path");
    StringWriter sw = new StringWriter();
    IOUtils.copy(filePathPart.getInputStream(), sw);
    String path = sw.getBuffer().toString();
    File outputFile = new File(path);
    FileWriter out = new FileWriter(outputFile);
    IOUtils.copy(filePart.getInputStream(), out);
    log.info("File [" + path + "] has been saved to the server.");
    out.close();
    sw.close();
}
I'm no expert on this stuff - and there doesn't appear to be much help via Google... Any help would be great.
Thanks, Pete
Upvotes: 2
Views: 4144
Reputation: 1108537
Your concrete problem is caused by the conversion of the incoming bytes to characters, which happens because you're using FileWriter instead of FileOutputStream here:
FileWriter out = new FileWriter(outputFile);
ZIP files are binary files, represented by a specific sequence of bytes, not character files like text, HTML, or XML. By converting bytes to characters this way, you're mangling the original binary content, so the file is no longer recognizable as a ZIP file. You end up with a corrupted file.
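To see why the size changes, here is a tiny illustration (my own example, not from the original post) of what a byte-to-character round trip does to gzip data:
// Requires java.nio.charset.StandardCharsets. The first two bytes of any gzip file are 0x1f 0x8b.
byte[] gzipMagic = {(byte) 0x1f, (byte) 0x8b};
// 0x8b is not valid UTF-8, so decoding replaces it with U+FFFD, the replacement character.
String asText = new String(gzipMagic, StandardCharsets.UTF_8);
// Encoding the string back yields 4 bytes instead of 2: the data has grown and changed.
byte[] roundTripped = asText.getBytes(StandardCharsets.UTF_8);
System.out.println(roundTripped.length); // prints 4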
If you use FileOutputStream instead, your problem will be solved. There's absolutely no need to replace all of this with Commons FileUpload.
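For clarity, a minimal sketch of the corrected copy (the try-with-resources wrapping is my addition; the variable names follow the servlet in the question):
Part filePart = req.getPart("package");
File outputFile = new File(path); // 'path' read from the "path" part exactly as in the question
try (InputStream in = filePart.getInputStream();
     OutputStream out = new FileOutputStream(outputFile)) {
    IOUtils.copy(in, out); // copies bytes verbatim, no charset conversion involved
}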
Unrelated to the concrete problem: reusing a client-specific absolute path on the server side is not a good idea for security reasons, as you'll find out sooner or later. At most, reuse the file name, preferably in combination with File#createTempFile() to auto-generate a unique filename suffix.
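For example (the upload directory and the FilenameUtils.getName call are my own assumptions, not part of this answer):
// Keep only the base file name; never trust a client-supplied directory.
String fileName = FilenameUtils.getName(clientSuppliedName); // Commons IO; 'clientSuppliedName' is hypothetical
File uploadDir = new File("/path/to/uploads"); // assumed server-controlled location
// createTempFile appends a unique suffix, so concurrent uploads cannot collide.
File outputFile = File.createTempFile(fileName + "-", ".tar.gz", uploadDir);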
Upvotes: 3
Reputation: 85
I made this work by using Apache Commons FileUpload:
Send code:
HttpClient http = new DefaultHttpClient();
HttpPost post = new HttpPost(url);
post.addHeader("path", file.getAbsolutePath());
MultipartEntity multipart = new MultipartEntity();
ContentBody fileContent = new FileBody(file); // for a tar.gz, "application/x-gzip" can be passed as the content type
multipart.addPart("package", fileContent);
post.setEntity(multipart);
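The snippet above builds but does not send the request; a minimal assumed completion (using org.apache.http.HttpStatus and EntityUtils) would be:
HttpResponse response = http.execute(post);
int status = response.getStatusLine().getStatusCode();
EntityUtils.consume(response.getEntity()); // release the underlying connection
if (status != HttpStatus.SC_OK) {
    log.error("Upload to [" + url + "] failed with HTTP status " + status + ".");
}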
Receive code:
@Override
protected void doPost(HttpServletRequest req, HttpServletResponse resp) throws ServletException, IOException {
    log.info("File transfer request received, collecting file information and saving to server.");
    FileItemFactory factory = new DiskFileItemFactory();
    ServletFileUpload upload = new ServletFileUpload(factory);
    try {
        List fileItems = upload.parseRequest(req);
        Iterator iterator = fileItems.iterator();
        if (iterator.hasNext()) {
            FileItem fileItem = (FileItem) iterator.next();
            File file = new File(req.getHeader("path"));
            fileItem.write(file);
            log.info("File [" + fileItem.getName() + "] has been saved to the server.");
        }
    } catch (Exception ex) {
        log.error("Unable to retrieve or write file set...", ex);
    }
}
Upvotes: 2