Nils

Reputation: 785

POST 4GB file from shell using cURL

I am trying to POST a 4 GB file to a REST API.

Instead of uploading the file's contents, cURL sends an empty body, and the server stores a file with Content-Length: 0.

curl -v -i -d @"/work/large.png" -H "Transfer-Encoding: chunked" http://localhost:8080/files
* Adding handle: conn: 0x7fcafc00aa00
* Adding handle: send: 0
* Adding handle: recv: 0
* Curl_addHandleToPipeline: length: 1
* - Conn 0 (0x7fcafc00aa00) send_pipe: 1, recv_pipe: 0
* About to connect() to localhost port 8080 (#0)
*   Trying localhost...
* Connected to localhost (localhost) port 8080 (#0)
> POST /files HTTP/1.1
> User-Agent: curl/7.30.0
> Host: localhost:8080
> Accept: */*
> Transfer-Encoding: chunked
> Authorization: bearer XXX.XXX.XXX
> x-user-token: bearer XXX.XXX.XXX
* upload completely sent off: 5 out of 0 bytes
< HTTP/1.1 201 Created
< Date: Thu, 02 Jan 2014 14:55:46 GMT
< ETag: "d41d8cd98f00b204e9800998ecf8427e"
< Location: http://localhost:8080/files/66032e34-9490-4556-8495-fb485ca12811
* Server nginx/1.4.1 is not blacklisted
< Server: nginx/1.4.1
< Content-Length: 0
< Connection: keep-alive

Smaller files upload as expected.

-rw-r--r--  1 user1  wheel  4403200000  2 Jan 15:02 /work/large.png

Why does the upload fail, and how can I upload a file of this size correctly?

Cheers.

Upvotes: 13

Views: 36593

Answers (5)

Ouss

Reputation: 3885

In my case, curl was consuming far more memory than the file size would suggest. I noticed that the response was the issue (perhaps a memory leak in curl or bash?), and I solved it by redirecting curl's output to a file:

curl {{command arguments and url}} > curl_response.data

That solved the problem of curl consuming too much memory.
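The same idea can be sketched with curl's own -o option, which writes the response body straight to disk without piping it through the shell. The file:// URL below is only a stand-in so the sketch runs locally; in practice it would be the real upload endpoint:

```shell
# Write the response body directly to a file with -o instead of a
# shell redirect; file:// is a placeholder so this runs without a server.
printf 'response-body' > fake_endpoint.txt
curl -sS -o curl_response.data "file://$PWD/fake_endpoint.txt"
cat curl_response.data
```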

Upvotes: 1

Dmitrii

Reputation: 1

The out-of-memory issue was resolved for me by adding physical memory to the server. The server was using 7 GB of RAM and the archive took up 17 GB; 7 + 17 = 24 GB, so I allocated 32 GB of RAM. The recovery then went without problems.

Upvotes: -1

Benoit Delbosc

Reputation: 441

I think you should consider using the -T option instead of --data-binary. --data-binary loads the entire file into memory (curl 7.47); at best the upload is slow, at worst the OOM killer answers with a Killed message.

curl -XPOST -T big-file.iso https://example.com
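A runnable sketch of the streaming behavior: with -T, curl reads the named file from disk as it sends, rather than holding it in a buffer. The file:// destination below is only so the example runs without a server (note that -T defaults to PUT, which is why the command above forces POST with -X):

```shell
# -T streams the named file instead of buffering it in memory.
# The file:// destination is a stand-in so this runs without a server.
printf 'iso-payload' > big-file.iso
curl -sS -T big-file.iso "file://$PWD/uploaded.iso"
cmp -s big-file.iso uploaded.iso && echo "streamed intact"
```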

Upvotes: 31

jb.

Reputation: 23995

To upload large binary files using cURL, you'll need the --data-binary flag.

In my case it was:

 curl -X PUT --data-binary @big-file.iso https://example.com

Note: this is really an extended version of @KarlC comment, which actually is the proper answer.

Upvotes: 3

allstrives

Reputation: 634

Did you verify that the connection is not timing out? Check whether CURLOPT_POSTFIELDS has a length or size limit, and see "Can't post data to rest server using cURL with content length larger than 1MB".

Based on my research, all I can say is that the issue is on the server side. It could be a memory issue (buffer-size related), a timeout issue, and so on; quite a lot depends on the platform you are using on the server side. So please provide some details about the server side and some log output, and especially try to capture the error log.
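If the server-side logs are not available, curl can at least capture the client-side evidence. A sketch (again with a file:// URL as a placeholder for the real endpoint) that saves a full ASCII wire trace plus the response body for inspection:

```shell
# Capture an ASCII trace of the transfer and the response body;
# the file:// URL is a placeholder so the sketch is runnable.
printf 'probe' > probe.txt
curl -sS --trace-ascii trace.log -o response.body "file://$PWD/probe.txt"
[ -f trace.log ] && echo "trace written"
```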

Upvotes: 0
