Reputation: 3184
There are n files, varying in size. How could we efficiently append the contents of all the files into a single file?
Would any particular techniques or algorithms help? Basically, I am looking for an efficient way to do this in the C language.
Upvotes: 0
Views: 191
Reputation: 360046
Start simple. Multithreading will introduce significant complexity, and won't necessarily make things run any faster. Pseudocode time:
Create a new file "dest" in write-only mode.
For each file "source" you want to append:
    Open "source" in read-only mode
    For each line "L" in "source":
        Write "L" to "dest"
    Close "source"
Close "dest"
BTW, this is dead simple (and near-optimal) to implement using simple command-line Linux tools (cat, etc.), though of course that isn't exactly portable to Windows. One-liner example:
for i in `find . -type f -name "*.txt"`; do cat "$i" >> result.out; done
(Find every .txt file in the current directory and append it to result.out.)
Upvotes: 3
Reputation: 72
Since I don't know what the contents of the files are or what the purpose of appending them is, this solution might not be the best fit if it's just text or something. However, I'd probably find a zip library to use (either licensed or open source), then just zip all the files into a single archive.
zlib looks interesting: http://www.zlib.net/
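One caveat: zlib itself produces gzip (.gz) streams rather than .zip archives (the contrib/minizip code shipped with the zlib source handles the .zip format). As a rough sketch, assuming a single compressed output stream is acceptable (the function name gzip_concat is made up):

#include <stdio.h>
#include <zlib.h>

/* Compress the contents of all input files into one .gz stream
 * using zlib's gzip-file API (gzopen/gzwrite/gzclose). */
int gzip_concat(const char *out_path, const char **sources, int count)
{
    gzFile out = gzopen(out_path, "wb");
    if (!out)
        return -1;

    char buf[64 * 1024];
    for (int i = 0; i < count; i++) {
        FILE *src = fopen(sources[i], "rb");
        if (!src) {
            gzclose(out);
            return -1;
        }
        size_t n;
        while ((n = fread(buf, 1, sizeof buf, src)) > 0)
            gzwrite(out, buf, (unsigned)n);   /* compress and append */
        fclose(src);
    }
    gzclose(out);
    return 0;
}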
Upvotes: 1
Reputation: 210765
Go through and find the total size of all of the files.
Then preallocate an output file of that size, go through them again, and write the data to your output.
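A rough POSIX sketch of this two-pass approach, assuming posix_fallocate is available (Linux and most POSIX systems); the function name copy_preallocated is made up:

#include <fcntl.h>
#include <stdio.h>
#include <sys/stat.h>
#include <unistd.h>

/* First pass: sum the sizes with stat(). Then reserve that much
 * space in the output with posix_fallocate(), and copy the data
 * in a second pass. */
int copy_preallocated(const char *dest_path, const char **sources, int count)
{
    off_t total = 0;
    struct stat st;
    for (int i = 0; i < count; i++) {          /* first pass: total size */
        if (stat(sources[i], &st) != 0)
            return -1;
        total += st.st_size;
    }

    int out = open(dest_path, O_WRONLY | O_CREAT | O_TRUNC, 0644);
    if (out < 0)
        return -1;
    posix_fallocate(out, 0, total);            /* reserve space up front */

    char buf[64 * 1024];
    for (int i = 0; i < count; i++) {          /* second pass: copy data */
        int in = open(sources[i], O_RDONLY);
        if (in < 0) { close(out); return -1; }
        ssize_t n;
        while ((n = read(in, buf, sizeof buf)) > 0)
            write(out, buf, n);
        close(in);
    }
    close(out);
    return 0;
}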
Upvotes: 2