Reputation: 45
I'm trying to upload a file to Azure Storage through an Azure Function. I was successful in uploading a plain-text file, but files of any other type get corrupted. What I observed is that the number of bytes I receive is less than the actual size (bodyLength < contentLength).
I tried changing the request data type to HttpRequestMessage<Optional<byte[]>> (and Byte[]), which throws a "cannot convert to string" error, as reported in byte[] input broken #239.
@FunctionName("UploadFile")
public HttpResponseMessage run(@HttpTrigger(name = "req", methods = { HttpMethod.GET,
HttpMethod.POST }, authLevel = AuthorizationLevel.FUNCTION) HttpRequestMessage(Optional(String>> request,
final ExecutionContext context)
throws InvalidKeyException, URISyntaxException, StorageException, IOException {
CloudStorageAccount storageAccount = CloudStorageAccount.parse(_storageConnString);
CloudBlobClient blobClient = storageAccount.createCloudBlobClient();
CloudBlobContainer blobContainer = blobClient.getContainerReference(_containerName);
CloudBlockBlob blob = blobContainer.getBlockBlobReference(fileName);
try {
String body = request.getBody().get();
long bodyLength = body.length();
String contentLength = request.getHeaders().get("content-length");
InputStream inputStream = new ByteArrayInputStream(body.getBytes());
blob.upload(inputStream, Integer.parseInt(bodyLength));
} catch (Exception ex) {
return request.createResponseBuilder(HttpStatus.BAD_REQUEST).body(ex.getMessage()).build();
}
return request.createResponseBuilder(HttpStatus.OK).body("File uploaded successfully").build();
}
My requirement is to upload large files to storage through Azure Functions. Any help would be appreciated.
Upvotes: 2
Views: 3842
Reputation: 5549
I checked the official tutorial, but sadly nothing there helps. The documentation is quite old, and there are even some errors in it.
By testing locally, I found that the Azure Java Functions runtime does not handle different kinds of content as expected, even when you set the correct Content-Type header.
In most cases it treats the content as plain text. That is usually not a problem, as long as there are no unreadable special characters.
However, for other files, such as images, binaries, and anything else that may contain unreadable special characters, the Azure Function seems to simply drop some of them.
To solve this, set the request Content-Type header to application/octet-stream, which means the format is unknown. That way, the Azure Function keeps all of the content bytes.
However, George's way (using multipart/form-data) also works, because if the Content-Type is multipart/form-data, the system likewise keeps all the content (this is the trick), so that you can extract each part using the defined boundary string. The usage scenario for that method is uploading files from an HTML form: you just need to set enctype to "multipart/form-data"
<FORM method="POST" action="....." enctype="multipart/form-data">
<INPUT type="file" name="pic">
<INPUT type="submit" value="submit" name="submit">
</FORM>
So, if you just want to post a file to an Azure Function programmatically, set the Content-Type to application/octet-stream.
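For example, here is a minimal client-side sketch using the standard Java 11 HttpClient; the function URL, key, and file name are placeholders you would replace with your own:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.file.Path;

public class OctetStreamUploadClient {
    public static void main(String[] args) throws Exception {
        // Placeholder endpoint; replace with your function URL and key.
        String functionUrl = "https://<your-app>.azurewebsites.net/api/UploadFile?code=<function-key>";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(functionUrl))
                // application/octet-stream tells the Functions host to keep the raw bytes.
                .header("Content-Type", "application/octet-stream")
                .POST(HttpRequest.BodyPublishers.ofFile(Path.of("sample.pdf")))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}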
Upvotes: 0
Reputation: 14334
After testing, it's caused by the request header. From this similar answer, you can see that you have to send the body with a multipart/form-data header. Without that header, I get the same result as yours. After setting the Content-Type header, the lengths are the same.
However, even when the lengths are equal, if you send a file that is not plain text (a PDF or an image, for example), there is an issue in the Java Azure Functions runtime where some bytes are lost; it is the same situation the OP describes in the update to the answer I referred to.
So if you upload any other type of file, the solution is to convert the file to a Base64 string on the client, post that to the Function, and then decode the Base64 string inside the function before uploading to the blob.
// Decode the Base64 text body back into the original bytes, then upload them.
byte[] scanBytes = Base64.getDecoder().decode(request.getBody().get());
blob.uploadFromByteArray(scanBytes, 0, scanBytes.length);
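On the client side, the encoding is a one-liner; here is a small sketch, again using the Java 11 HttpClient, with the URL and file name as placeholders:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.Base64;

public class Base64UploadClient {
    public static void main(String[] args) throws Exception {
        // Placeholder endpoint; replace with your function URL and key.
        String functionUrl = "https://<your-app>.azurewebsites.net/api/UploadFile?code=<function-key>";
        // Base64 turns arbitrary bytes into plain ASCII, so nothing is mangled in transit.
        String base64Body = Base64.getEncoder()
                .encodeToString(Files.readAllBytes(Path.of("sample.pdf")));

        HttpResponse<String> response = HttpClient.newHttpClient().send(
                HttpRequest.newBuilder()
                        .uri(URI.create(functionUrl))
                        .header("Content-Type", "text/plain")
                        .POST(HttpRequest.BodyPublishers.ofString(base64Body))
                        .build(),
                HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}

Keep in mind that Base64 inflates the payload by roughly a third, which matters given the request-size limit noted below.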
Note: if you want to post large files, be aware that there is a maximum request size (100 MB), and it cannot be changed (see this GitHub issue). However, in the local environment the limit most commonly cited is 10 MB~20 MB; I hit that limit when testing locally. So it's better not to post very large files.
Upvotes: 1
Reputation: 14108
This seems to be because you need to upload the original stream to Blob Storage. I also tried converting it to a byte[] first, but that failed too. I used C# and met the same problem as yours. The problem is in this place:
String body = request.getBody().get();
InputStream inputStream = new ByteArrayInputStream(body.getBytes());
blob.upload(inputStream, Integer.parseInt(contentLength));
In C#, I use blob.UploadFromStreamAsync(req.Body); and the problem disappears. You can check whether Java has a similar method.
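In the Java storage SDK, the closest equivalents are CloudBlockBlob.upload(InputStream, long) and uploadFromByteArray. A minimal sketch of the same idea, assuming the runtime can bind the raw body as byte[] (which issue #239 suggests may not work on every version) and the client sends application/octet-stream:

// Hypothetical handler body, with the trigger bound as HttpRequestMessage<Optional<byte[]>>.
byte[] bytes = request.getBody()
        .orElseThrow(() -> new IllegalArgumentException("empty body"));
// Upload the bytes untouched, mirroring C#'s UploadFromStreamAsync(req.Body).
blob.upload(new ByteArrayInputStream(bytes), bytes.length);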
Upvotes: 0