TeAmEr

Reputation: 4773

Sending large data from a server to another

I am using cURL to send large amounts of data between servers, via POST. Is this OK, or is there a better/standard way to send large serialized data with cURL?

The problem is the post_max_size setting in PHP; I have to raise it (the default is 2 MB). I haven't run into any problems with this yet, but once the system is live, payloads larger than 50 MB may be sent each time!
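For reference, the directives that cap the request size live in php.ini on the receiving server; a sketch with example values (these are illustrations, not recommendations):

```ini
; php.ini on the receiving server (example values)
post_max_size = 100M        ; must exceed the largest expected payload
upload_max_filesize = 100M  ; only relevant for multipart file uploads
memory_limit = 256M         ; should be comfortably larger than post_max_size
max_execution_time = 300    ; processing a 50 MB payload may take a while
```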

Any ideas? Thank you.

EDIT :

I am sending DATA, not FILES: data that, once received, should be processed by the second server (saved to a database/file, or used to trigger some action), and the receiver might need to send a response after processing it.

I would just like to know: will I face any other problems besides post_max_size? (Leave aside the timeouts of both cURL and PHP.) Is there any way to make the server ignore post_max_size, maybe by using PUT? Does post_max_size affect PUT, and how would I use it via cURL? So many questions!
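To illustrate the PUT idea: as far as I know, post_max_size is only enforced for POST requests, so sending the payload as a raw PUT body and reading it from php://input on the receiver sidesteps that limit (memory_limit still applies). A minimal sketch, assuming a hypothetical receive.php endpoint:

```php
<?php
// Sender: push a large serialized payload as a raw PUT body.
// The URL is a placeholder for your receiving endpoint.
$payload = serialize(array_fill(0, 1000, 'example row'));

$ch = curl_init('https://example.com/receive.php');
curl_setopt($ch, CURLOPT_CUSTOMREQUEST, 'PUT');
curl_setopt($ch, CURLOPT_POSTFIELDS, $payload);  // sent as the raw request body
curl_setopt($ch, CURLOPT_HTTPHEADER, array('Content-Type: application/octet-stream'));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // capture the receiver's response
$response = curl_exec($ch);
curl_close($ch);

// Receiver (receive.php): read the raw body instead of $_POST.
// $data = file_get_contents('php://input');
// $restored = unserialize($data);
```

The receiver can process the data and echo a result, which comes back in `$response` on the sending side.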

Upvotes: 0

Views: 1884

Answers (2)

uzyn

Reputation: 6683

Using cURL is perfectly fine.

Personally, I would prefer not to do it through a web server (e.g. Apache), as there are too many potential points of failure along the way: PHP timeouts, web server timeouts, memory limits, missing write privileges, confinement to the web root, etc.

I would prefer to do it through mechanisms designed for file transfers:

  • FTP
  • scp (file copy over SSH)
  • Dropbox (there are APIs)
  • Amazon S3 (simple API with PHP library)
  • etc.
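For the FTP route, a minimal sketch using PHP's built-in FTP extension (host, credentials, and paths are placeholders):

```php
<?php
// Upload a large serialized dump over FTP instead of HTTP POST.
// Host and credentials below are placeholders.
$conn = ftp_connect('ftp.example.com');
ftp_login($conn, 'user', 'secret');
ftp_pasv($conn, true);  // passive mode is friendlier to firewalls/NAT
ftp_put($conn, 'incoming/data.ser', '/tmp/data.ser', FTP_BINARY);
ftp_close($conn);
```

The receiving server can then poll the drop directory (or be notified) and process the file at its own pace, free of HTTP body-size limits.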

Upvotes: 2

donald123

Reputation: 5739

That approach is OK.

Two more ideas for you:

  1. Use FTP (you can upload large serialized files to an FTP server that is reachable from both of your servers).
  2. Use MySQL (you can store the large serialized content on a MySQL server).
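The MySQL idea could look like this sketch using PDO (the DSN, credentials, and `payloads` table are placeholders):

```php
<?php
// Store the serialized payload in a shared MySQL table instead of POSTing it.
// DSN, credentials, and table/column names below are placeholders.
$data = array('example' => str_repeat('x', 1024));

$pdo = new PDO('mysql:host=db.example.com;dbname=transfer', 'user', 'secret');
$stmt = $pdo->prepare('INSERT INTO payloads (created_at, body) VALUES (NOW(), :body)');
$stmt->bindValue(':body', serialize($data), PDO::PARAM_LOB);
$stmt->execute();
```

Note that for 50 MB rows, MySQL's own max_allowed_packet setting would also need to be raised on the database server.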

Upvotes: 0
