Leigh Riffel

Reputation: 6641

Quickly create large file on a Windows system

In the same vein as Quickly create a large file on a Linux system, I'd like to quickly create a large file on a Windows system. By large I'm thinking 5 GB. The content doesn't matter. A built-in command or short batch file would be preferable, but I'll accept an application if there are no other easy ways.

Upvotes: 311

Views: 437154

Answers (24)

DLT

Reputation: 440

In PowerShell...

$file = [System.IO.File]::Create("$pwd\1GB.dat")
$file.SetLength(1GB)
$file.Close()
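
For the 5 GB size from the question, the same pattern works; only the length changes (the filename 5GB.dat is just an example, and PowerShell understands the GB suffix):

$file = [System.IO.File]::Create("$pwd\5GB.dat")
$file.SetLength(5GB)
$file.Close()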

Upvotes: 5

CKuharski

Reputation: 314

The following commands create a 1 MB file dummy.txt within a few seconds:

echo "This is just a sample line appended to create a big file.. " > dummy.txt 
for /L %i in (1,1,14) do type dummy.txt >> dummy.txt

See here: http://www.windows-commandline.com/how-to-create-large-dummy-file/

Upvotes: 2

Patrick Cuff

Reputation: 29836

fsutil file createnew <filename> <length>

where <length> is in bytes.

For example, to create a 1 MB (more precisely, 1 MiB = 1,048,576 bytes) file named 'test', use:

fsutil file createnew test 1048576
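
For the 5 GB file asked about in the question (5 × 1,073,741,824 bytes):

fsutil file createnew test 5368709120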

fsutil requires administrative privileges though.

Upvotes: 510

Meghdad

Reputation: 179

You can use the cat command (an alias for Get-Content) in PowerShell.

First create a simple text file with a few characters; the more characters the initial file contains, the faster it grows. Let's call it out.txt. Then, in PowerShell:

cat out.txt >> out.txt

Wait as long as necessary for the file to get big enough, then press Ctrl+C to stop it.

Upvotes: 2

lkurts

Reputation: 378

Use:

/*
Creates an empty file, which can take up all of the disk
space. Just specify the desired file size (in bytes) on
the command line.
*/

#include <windows.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char* argv[])
{
    if (argc != 2)
    {
        fprintf(stderr, "Usage: fulldisk <size in bytes>\n");
        return 1;
    }

    /* atoi limits the size to about 2 GB. */
    int size = atoi(argv[1]);
    const char* full = "fulldisk.dsk";
    HANDLE hf = CreateFile(full,
                           GENERIC_WRITE,
                           0,
                           0,
                           CREATE_ALWAYS,
                           0,
                           0);
    if (hf == INVALID_HANDLE_VALUE)
    {
        fprintf(stderr, "Could not create %s\n", full);
        return 1;
    }

    /* Move the end-of-file marker to the requested size without writing data. */
    SetFilePointer(hf, size, 0, FILE_BEGIN);
    SetEndOfFile(hf);
    CloseHandle(hf);
    return 0;
}

Upvotes: 6

Bill W

Reputation: 31

Plain ol' C... this builds under MinGW GCC on Windows and should work on any 'generic' C platform.

It generates a null file of a specified size. The resulting file is NOT just a directory space-occupier entry; it actually occupies the specified number of bytes. This is fast because no actual writes occur except for the single byte written before close.

My instance produces a file full of zeros - this could vary by platform; this program essentially sets up the directory structure for whatever data is hanging around.

#include <stdio.h>
#include <stdlib.h>

FILE *file;

int main(int argc, char **argv)
{
    unsigned long size;

    if (argc != 3)
    {
        printf("Error ... syntax: Fillerfile  size  Fname \n\n");
        exit(1);
    }

    size = strtoul(argv[1], NULL, 10);

    printf("Creating %lu byte file '%s'...\n", size, argv[2]);

    /* Binary mode so the seek and the single 0x00 byte behave predictably. */
    if (!(file = fopen(argv[2], "wb+")))
    {
        printf("Error opening file %s!\n\n", argv[2]);
        exit(1);
    }

    /* Seek to the last byte and write it; the OS fills in the rest. */
    fseek(file, size - 1, SEEK_SET);
    fputc(0x00, file);
    fclose(file);

    return 0;
}

Upvotes: 3

jkdba

Reputation: 2519

Simple answer in Python: if you need to create a large real text file, I just used a simple while loop and was able to create a 5 GB file in about 20 seconds. I know it's crude, but it is fast enough.

outfile = open("outfile.log", "a+")

def write(outfile):
    # Each call appends about 1.2 KB of repeated text plus a newline.
    outfile.write("hello world " * 100 + "\n")

i = 0
while i < 1000000:
    write(outfile)
    i += 1
outfile.close()

Upvotes: 1

NEt_Bomber

Reputation: 21

You can try this C++ code:

#include <iostream>
#include <fstream>

using namespace std;

int main()
{
    long long a = 0;
    ofstream fcout("big_file.txt");
    // Runs until you stop it (Ctrl+C); check the file size and
    // kill the program once the file is as large as you need.
    for (;; a += 1999999999LL)
    {
        fcout << a;
    }
}

It may take some time depending on your CPU and disk speed; stop the program once the file is as large as you need.

Upvotes: 2

GabrielB

Reputation: 141

Another GUI solution: WinHex.

“File” > “New” > “Desired file size” = [X]
“File” > “Save as” = [Name]

Contrary to some of the solutions already proposed, it actually writes the (empty) data to the device.

It also allows filling the new file with a selectable pattern or random data:

“Edit” > “Fill file” (or “Fill block” if a block is selected)

Upvotes: 1

MusikPolice

Reputation: 1749

I found an excellent, configurable utility at https://github.com/acch/genfiles.

It fills the target file with random data, so there are no problems with sparse files, and for my purposes (testing compression algorithms) it gives a nice level of white noise.

Upvotes: 4

yosh m

Reputation: 193

The simplest way I've found is this free utility: http://www.mynikko.com/dummy/

It creates dummy files of arbitrary size that are filled either with spaces or with non-compressible content (your choice).

Upvotes: 7

chersun

Reputation: 402

I was recently looking for a way to create a large dummy file with its space actually allocated. All of the solutions looked awkward, so finally I just used the DISKPART utility built into Windows (since Windows Vista):

DISKPART
CREATE VDISK FILE="C:\test.vhd" MAXIMUM=20000 TYPE=FIXED

Where MAXIMUM is the resulting file size in megabytes (20000 MB, roughly 20 GB, here).

Upvotes: 10

Rod

Reputation: 1571

PowerShell one-liner to create a file in C:\Temp that fills drive C:, leaving only 10 MB free:

$f = [io.file]::Create("C:\temp\bigblob.txt"); $f.SetLength((gwmi Win32_LogicalDisk -Filter "DeviceID='C:'").FreeSpace - 10MB); $f.Close()

Upvotes: 10

shenku

Reputation: 12468

Open Windows Task Manager, find the biggest process you have running, right-click it, and click Create dump file.

This will create a file in your temporary folder whose size is related to how much memory the process is using.

You can easily create a file sized in gigabytes.


Upvotes: 19

ZXX

Reputation: 4770

Check out RDFC http://www.bertel.de/software/rdfc/index-en.html

RDFC is probably not the fastest, but it does allocate data blocks. The absolute fastest approach would have to use a lower-level API to allocate cluster chains and put them into the MFT without writing data.

Beware that there's no silver bullet here - if "creation" returns instantly, that means you got a sparse file, which just fakes a large file, but you won't get data blocks/chains until you write into it. If you just read it, you'd get very fast zeros, which could make you believe that your drive all of a sudden got blazingly fast :-)
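
If you want to check whether a tool actually left you with a sparse file, one quick check (the path is just an example) is to ask fsutil whether the sparse flag is set:

fsutil sparse queryflag C:\temp\bigfile.dat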

Upvotes: 18

gimel

Reputation: 86502

Short of writing a full application, we Python guys can achieve files of any size with four lines; the same snippet works on Windows and Linux (the os.stat() line is just a check):

>>> f = open('myfile.txt','w')
>>> f.seek(1024-1) # an example, pick any size
>>> f.write('\x00')
>>> f.close()
>>> import os
>>> os.stat('myfile.txt').st_size
1024L
>>>

Upvotes: 6

Alan Pallath

Reputation: 99

I've made some additions to the fsutil method from the accepted answer, to create files with many different extensions and/or of various sizes.

set file_list=avi bmp doc docm docx eps gif jpeg jpg key m4v mov mp4 mpg msg nsf odt pdf png pps ppsx ppt pptx rar rtf tif tiff txt wmv xls xlsb xlsm xlsx xps zip 7z
set file_size=1
for %%f in (%file_list%) do (
    fsutil file createnew valid_%%f.%%f %file_size%
) > xxlogs.txt

The code can be cloned from https://github.com/iamakidilam/bulkFileCreater.git

Upvotes: -1

Joey

Reputation: 354864

You can use the Sysinternals Contig tool. It has a -n switch which creates a new file of a given size. Unlike fsutil, it doesn't require administrative privileges.
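
For example, something like this should create a 5 GB file (a sketch; the length argument to -n is given in bytes):

contig -n bigfile.dat 5368709120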

Upvotes: 42

Noctis Skytower

Reputation: 22041

Quick to execute or quick to type on a keyboard? If you use Python on Windows, you can try this:

cmd /k py -3 -c "with open(r'C:\Users\LRiffel\BigFile.bin', 'wb') as file: file.truncate(5 * 1 << 30)"

Upvotes: 1

Giri

Reputation: 512

I was searching for a way to generate large files with data, not just a sparse file, and came across the technique below.

If you want to create a file with real data, you can use the following command-line script.

echo "This is just a sample line appended to create a big file.. " > dummy.txt
for /L %i in (1,1,14) do type dummy.txt >> dummy.txt

(Run the above two commands one after another, or add them to a batch file; in a batch file, use %%i instead of %i.)

The above commands create a 1 MB file, dummy.txt, within a few seconds.

Upvotes: 27

Florian Feldhaus

Reputation: 5942

Temp files should be stored in the Windows temp folder. Based on the answer from Rod, you can use the following one-liner to create a 5 GB temp file and return its filename:

[System.IO.Path]::GetTempFileName() | % { $f = [System.IO.File]::Create($_); $f.SetLength(5GB); $f.Close(); $_ }

Explanation:

  • [System.IO.Path]::GetTempFileName() generates a random filename with a random extension in the Windows temp folder
  • The pipeline passes that name to [System.IO.File]::Create($_), which creates the file and returns a FileStream
  • The length of the newly created file is set with .SetLength(5GB). I was a bit surprised to discover that PowerShell supports byte-unit suffixes like GB, which is really helpful.
  • The file handle needs to be closed with .Close() to allow other applications to access it
  • Finally, $_ returns the filename to the pipeline

Upvotes: 4

f.ardelian

Reputation: 6956

I needed a regular 10 GB file for testing, so I couldn't use fsutil because it creates sparse files (thanks @ZXX).

@echo off

:: Create file with 2 bytes
echo.>file-big.txt

:: Expand to 1 KB
for /L %%i in (1, 1, 9) do type file-big.txt>>file-big.txt

:: Expand to 1 MB
for /L %%i in (1, 1, 10) do type file-big.txt>>file-big.txt

:: Expand to 1 GB
for /L %%i in (1, 1, 10) do type file-big.txt>>file-big.txt

:: Expand to 4 GB
if exist file-4gb.txt del file-4gb.txt
for /L %%i in (1, 1, 4) do type file-big.txt>>file-4gb.txt

del file-big.txt

I wanted to create a 10 GB file, but for some reason it only showed up as 4 GB, so to be safe I stopped at 4 GB. If you really want to be sure your file will be handled properly by the operating system and other applications, stop expanding it at 1 GB.

Upvotes: 16

Byron Whitlock

Reputation: 53921

Check the Windows Server 2003 Resource Kit Tools. There is a utility called Creatfil.

 CREATFIL.EXE
 -? : This message
 -FileName -- name of the new file
 -FileSize -- size of file in KBytes, default is 1024 KBytes

It is similar to mkfile on Solaris.
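
A usage sketch (I haven't verified the exact argument syntax beyond the help text above; the size is given in KB, so 5 GB is roughly 5,242,880 KB):

creatfil bigfile.dat 5242880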

Upvotes: 14

Leigh Riffel

Reputation: 6641

I found a solution using DEBUG at http://www.scribd.com/doc/445750/Create-a-Huge-File, but I don't know an easy way to script it and it doesn't seem to be able to create files larger than 1 GB.

Upvotes: 3
