Reputation: 623
I want to create a 128K file on Linux whose contents are 128K ones (i.e. 0xFF bytes). What is the fastest way to do that?
Upvotes: 2
Views: 657
Reputation: 207425
This is an order of magnitude faster than anything with dd
:-)
#include <stdio.h>
#include <string.h>
#include <stdlib.h>

#define SIZE (128*1024)

int main(void){
    char *p = malloc(SIZE);
    if (!p) return 1;
    memset(p, '\xff', SIZE);     /* fill the buffer with 0xFF bytes */
    fwrite(p, 1, SIZE, stdout);  /* write the raw bytes; fprintf would treat p as a format string */
    free(p);
    return 0;
}
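The program writes the bytes to stdout, so compile it and redirect the output to the target file (a minimal sketch; ones.c and file are placeholder names):
cc -O2 ones.c -o ones
./ones > file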
Upvotes: 3
Reputation: 189357
Why do you require the fastest solution, and fastest by what metric? Writing the bytes to the disk is probably going to be the bottleneck anyway, so it doesn't really matter how fast you generate them, as long as it's faster than the write rate of the output medium.
dd if=/dev/zero bs=131072 count=1 |
tr '\000' '\377' >file
Different variants of tr use slightly different syntax, so you may need to tweak this a bit.
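Either way, a quick sanity check with standard tools (assuming file is the output name used above) confirms the size and contents:
wc -c file
od -t x1 file | head -n 2
The first command should report 131072 bytes; the second should show a line of ff bytes (od collapses the repeated lines into a *).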
Here's a memory-hogging but hopefully completely portable alternative:
perl -e 'printf("\xff" x 131072)' >file
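A minor variant, not from the answer above, sidesteps printf's format-string handling by using print instead:
perl -e 'print "\xff" x 131072' > file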
Upvotes: 5