redstone

Reputation: 63

AFNetworking 2.0 sends parameters and image to server

I am using AFNetworking 2 to send parameters and an image to the server:

    manager.POST(urlPath, parameters: parameters, constructingBodyWithBlock: { (formData: AFMultipartFormData!) -> Void in
            formData.appendPartWithFileData(imageData!, name: "image", fileName: "dummy.jpg", mimeType: "image/jpeg")
        }, success: { (dataTask: NSURLSessionDataTask!, responseObject: AnyObject!) -> Void in
            println("success: \(responseObject.description)")

        }, failure: { (dataTask: NSURLSessionDataTask!, error: NSError!) -> Void in
            println("failure: \(error)")
    })


On the server side, the data arrives as a dictionary merging the parameters (a QueryDict) with the image data (a MultiValueDict):

    data=MergeDict(<QueryDict: {u'owner': [u'6'], u'description': [u'this is p1'], u'name': [u'p1']}>,
                   <MultiValueDict: {u'image': [<InMemoryUploadedFile: file.jpg (image/jpeg)>]}>)


I reckon the MultiValueDict comes from this part of the code:

    formData.appendPartWithFileData(imageData!, name: "image", fileName: "dummy.jpg", mimeType: "image/jpeg")

However, I want the MultiValueDict to look like this instead:

    {u'groupImages': [{u'image': [<InMemoryUploadedFile: file.jpg (image/jpeg)>]}]}

That is, a dictionary whose value is an array, and each element of the array is another dictionary.

So what can I do to make formData.appendPartWithFileData produce the data format above?


EDIT:

I have seen some posts similar to my question, for example this one: AFNetworking post image in nested json.

I have tried changing my code to this:

    formData.appendPartWithFileData(imageData!, name: "groupImages[0].image", fileName: "dummy.jpg", mimeType: "image/jpeg")

or this:

    formData.appendPartWithFileData(imageData!, name: "groupImages[0][image]", fileName: "dummy.jpg", mimeType: "image/jpeg")

but neither of them worked for me.

My server expects to receive JSON like this:

    {
        "name": "p2",
        "owner": 6,
        "description": "this is p2",
        "groupImages": [{
            "image": <InMemoryUploadedFile: dummy.jpg (image/jpeg)>
        }]
    }

Any idea?

Upvotes: 4

Views: 907

Answers (1)

Aaron Brager

Reputation: 66242

When you send a multipart HTTP request, the JSON data and the image data are separate - literally in multiple parts. It might be worth taking a look at this answer to "What is HTTP Multipart Request" so you can see how the data is transmitted.

The long and short of it is that the JSON and the image are merged into a dictionary on the server end. The image is not transmitted embedded within the JSON. The semantics of how they're merged (for example, how a name like groupImages[0][image] is used to merge in with the JSON dictionary) are determined by the server, not by your iOS app.

So, your server team should be able to specify how you name this file so that it's merged with the dictionary correctly. They should be able to produce a sample HTTP request that works properly (for example, using curl or Postman). If your server has a web app, you could inspect the analogous function in the web app to see what the request looks like there. Once you have a working request, you can mimic it by comparing your outgoing NSURLRequest to the sample request they provide.
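As a starting point for that comparison, here is a minimal sketch of how you might build the multipart request without sending it and dump its body to a file you can diff against the server team's sample. It assumes AFNetworking 2.x called from Swift; urlPath, parameters, and imageData are the same values as in your question, and the part name "groupImages[0][image]" is only a placeholder until the server team confirms the naming convention they expect:

    // Build the same multipart request the session manager would send, but don't send it.
    let serializer = AFHTTPRequestSerializer()
    var serializationError: NSError?

    let multipartRequest = serializer.multipartFormRequestWithMethod("POST",
        URLString: urlPath,
        parameters: parameters,
        constructingBodyWithBlock: { (formData: AFMultipartFormData!) -> Void in
            // Placeholder part name - use whatever the server team specifies
            formData.appendPartWithFileData(imageData!, name: "groupImages[0][image]",
                fileName: "dummy.jpg", mimeType: "image/jpeg")
        },
        error: &serializationError)

    // The multipart body is streamed, so HTTPBody is nil; ask the serializer to
    // write the stream contents to a temporary file instead.
    let bodyPath = (NSTemporaryDirectory() as NSString).stringByAppendingPathComponent("multipart-body.txt")
    if let bodyFileURL = NSURL(fileURLWithPath: bodyPath) {
        serializer.requestWithMultipartFormRequest(multipartRequest,
            writingStreamContentsToFile: bodyFileURL) { error in
                if error != nil {
                    println("Could not write multipart body: \(error)")
                } else {
                    println("Headers: \(multipartRequest.allHTTPHeaderFields)")
                    println("Multipart body written to \(bodyPath)")
                }
        }
    }

Opening that file in a text editor shows every parameter and the image as separate body parts, each with its own Content-Disposition name - which is what the server's parser sees and merges into the dictionary.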

Upvotes: 2
