Reputation: 3114
I have this homework where I have to transfer a very big file from one source to multiple machines using a BitTorrent-like algorithm. Initially I cut the file into chunks and transfer the chunks to all the targets. The targets have the intelligence to share the chunks they have with other targets, and that works fine. I wanted to transfer a 4GB file, so I tarred four 1GB files together. Creating the 4GB tar file did not produce any error, but at the other end, while assembling all the chunks back into the original file, it errors out saying the file size limit was exceeded. How can I go about solving this 2GB limitation problem?
Upvotes: 8
Views: 10871
Reputation: 31281
You should use fseeko and ftello; see fseeko(3).
Note that you need to define _FILE_OFFSET_BITS as 64 before including stdio.h:
#define _FILE_OFFSET_BITS 64
#include <stdio.h>
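A minimal sketch of how that could look when writing past the 2GB mark (the output file name and the 3GB offset are made-up placeholders, and error handling is kept to a minimum):
#define _FILE_OFFSET_BITS 64   /* must come before any system header */
#include <stdio.h>
#include <sys/types.h>

int main(void)
{
    FILE *out = fopen("assembled.bin", "wb");  /* hypothetical output file */
    if (out == NULL)
        return 1;

    /* with a 64-bit off_t, fseeko/ftello are not limited to 2GB offsets */
    off_t offset = (off_t)3 * 1024 * 1024 * 1024;  /* 3GB */
    if (fseeko(out, offset, SEEK_SET) != 0)
        return 1;

    fputc('\0', out);  /* write one byte at the 3GB mark */
    printf("now at %lld bytes\n", (long long)ftello(out));
    fclose(out);
    return 0;
}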
Upvotes: 1
Reputation: 73762
I can think of two possible reasons: either the program that reassembles the chunks was built without large file support, or a per-process file size limit (ulimit -f) is getting in the way. For the first one, recompile with:
gcc -D_FILE_OFFSET_BITS=64
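To verify that the flag actually takes effect, a tiny check program (purely illustrative, the file name check.c is made up) can print the size of off_t; on a 32-bit build it is 4 bytes without the flag and 8 bytes with it:
#include <stdio.h>
#include <sys/types.h>

int main(void)
{
    /* 8 means 64-bit file offsets (large file support is active);
       4 means the process is still limited to 2GB files */
    printf("sizeof(off_t) = %zu\n", sizeof(off_t));
    return 0;
}
Compile and run it with, for example:
gcc -D_FILE_OFFSET_BITS=64 check.c && ./a.out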
Upvotes: 11
Reputation: 77251
This depends on the filesystem type. With ext3 I have no such problems with files that are significantly larger than 2GB.
If the underlying disk is FAT, NTFS or CIFS (SMB), you must also make sure you use the latest version of the appropriate driver. Some older drivers have file-size limits like the one you are hitting.
Upvotes: 4
Reputation: 67909
If your system supports it, you can get hints with: man largefile.
Upvotes: 1
Reputation: 1329822
Could this be related to a system limit configuration? Check the current limits with:
$ ulimit -a
vi /etc/security/limits.conf
vivek hard fsize 1024000
If you do not want any limit, remove the fsize line from /etc/security/limits.conf.
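If you prefer to check this from inside the assembling program rather than from the shell, getrlimit(2) reports the same limit. This is only a sketch, but note that "File size limit exceeded" is exactly the message printed when a process is killed by SIGXFSZ for exceeding RLIMIT_FSIZE:
#include <stdio.h>
#include <sys/resource.h>

int main(void)
{
    struct rlimit rl;

    /* RLIMIT_FSIZE is the largest file this process may create;
       exceeding it raises SIGXFSZ ("File size limit exceeded") */
    if (getrlimit(RLIMIT_FSIZE, &rl) != 0)
        return 1;

    if (rl.rlim_cur == RLIM_INFINITY)
        printf("file size limit: unlimited\n");
    else
        printf("file size limit: %llu bytes\n",
               (unsigned long long)rl.rlim_cur);
    return 0;
}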
Upvotes: 3