Squirvin

Reputation: 129

Program slowing down due to recursion

I'm trying to write a program that adds every single file and folder name on my C: drive to an ArrayList. The code works fine, but because of the massive amount of recursion, it gets painfully slow. Here is the code:

public static void updateFileDataBase()
{
    ArrayList<String> currentFiles = new ArrayList<String>();
    addEverythingUnder("C:/",currentFiles,new String[]{"SteamApps","AppData"});
    for(String name : currentFiles)
        System.out.println(name);
}
private static void addEverythingUnder(String path, ArrayList<String> list, String[] exceptions)
{
    System.gc();
    System.out.println("searching " + path);
    File search = new File(path);
    try
    {
        for(int i = 0; i < search.list().length; i++)
        {
            boolean include = true;
            for(String exception : exceptions)
                if(search.list()[i].contains(exception))
                    include = false;
            if(include)
            {
                list.add(search.list()[i]);
                if(new File(path + "/" + search.list()[i]).isDirectory())
                {
                    addEverythingUnder(path + "/" + search.list()[i],list,exceptions);
                }
            }
        }
    }
    catch(Exception error)
    {
        System.out.println("ACCESS DENIED");
    }
}

I was wondering if there was anything at all that I could do to speed up the process. Thanks in advance :)

Upvotes: 0

Views: 299

Answers (4)

Peter Lawrey

Reputation: 533492

because of the massive amount of recursion, it gets painfully slow

While your code is very inefficient, as EJP suggests, I suspect the problem is even more basic. Accessing a large number of files takes time because each one must be read from disk; only the first pass is slow, since repeated reads of the same data come from the cache and are much quicker. Opening files is also pretty slow on an HDD.

A typical HDD has a seek time of about 8 ms. If finding and opening a file takes two operations, you are looking at 16 ms per file. Say you have 10,000 files: that will take at least 160 seconds, no matter how efficient you make the code. BTW, on a decent SSD this would take about 1 second.

In short, you are likely hitting a hardware limit which has nothing to do with how you wrote your software. Shorter: don't keep large numbers of files if you want performance.
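The back-of-envelope estimate above works out like this (the seek time and operation count are the assumed figures from the answer, not measurements):

```java
public class Main {
    public static void main(String[] args) {
        double seekMs = 8;    // assumed typical HDD seek time
        int opsPerFile = 2;   // assumed: one seek to find the entry, one to open it
        int files = 10_000;

        // total = files * ops * per-op latency
        double totalSeconds = files * opsPerFile * seekMs / 1000.0;
        System.out.println(totalSeconds); // prints 160.0
    }
}
```

So at 16 ms per file, the disk alone accounts for minutes of wall-clock time; no rewrite of the traversal logic can get under that floor.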

Upvotes: 1

Steven Schlansker

Reputation: 38526

There is (as of Java 7) a built-in way to do this, Files.walkFileTree, which is much more efficient and removes the need to reinvent the wheel. It calls into a FileVisitor for every entry it finds. There are a couple of examples on the FileVisitor page to get you started.
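A minimal sketch of how the question's traversal could look with Files.walkFileTree, using SKIP_SUBTREE to prune excluded directories instead of descending into them (the listAll method name and the temp-directory demo in main are illustrative, not from the question):

```java
import java.io.IOException;
import java.nio.file.*;
import java.nio.file.attribute.BasicFileAttributes;
import java.util.ArrayList;
import java.util.List;

public class Main {
    // Collects every file and directory name under root, skipping any
    // directory whose name contains one of the given exception strings.
    static List<String> listAll(Path root, String... exceptions) throws IOException {
        List<String> names = new ArrayList<>();
        Files.walkFileTree(root, new SimpleFileVisitor<Path>() {
            @Override
            public FileVisitResult preVisitDirectory(Path dir, BasicFileAttributes attrs) {
                String name = dir.getFileName() == null ? "" : dir.getFileName().toString();
                for (String exception : exceptions)
                    if (name.contains(exception))
                        return FileVisitResult.SKIP_SUBTREE; // prune whole subtree
                if (!dir.equals(root))
                    names.add(name);
                return FileVisitResult.CONTINUE;
            }
            @Override
            public FileVisitResult visitFile(Path file, BasicFileAttributes attrs) {
                names.add(file.getFileName().toString());
                return FileVisitResult.CONTINUE;
            }
            @Override
            public FileVisitResult visitFileFailed(Path file, IOException exc) {
                return FileVisitResult.CONTINUE; // skip unreadable entries, don't abort
            }
        });
        return names;
    }

    public static void main(String[] args) throws IOException {
        // Demo on a throwaway directory tree rather than C:/
        Path root = Files.createTempDirectory("demo");
        Files.createFile(root.resolve("a.txt"));
        Files.createDirectories(root.resolve("AppDataCache"));
        Files.createFile(root.resolve("AppDataCache").resolve("hidden.txt"));

        System.out.println(listAll(root, "SteamApps", "AppData")); // prints [a.txt]
    }
}
```

Note that visitFileFailed replaces the question's catch-all "ACCESS DENIED" handler: one unreadable directory no longer stops the rest of the walk.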

Upvotes: 5

user207421

Reputation: 310869

Program slowing down due to recursion

No it isn't. Recursion doesn't make things slow. Poor algorithms and bad coding make things slow.

For example, you are calling File.list() four times for every file you process, as well as once per directory. You can save a factor of O(N) by calling it once per directory:

    for(File file : search.listFiles())
    {
        String name = file.getName();
        boolean include = true;
        for(String exception : exceptions)
            if(name.contains(exception))
                include = false;
        if(include)
        {
            list.add(name);
            if(file.isDirectory())
            {
                addEverythingUnder(file.getPath(),list,exceptions);
            }
        }
    }

Upvotes: 5

FUD

Reputation: 5184

Is there a particular reason for reinventing the wheel? If you don't mind, please use

http://commons.apache.org/proper/commons-io/apidocs/org/apache/commons/io/FileUtils.html#listFiles(java.io.File, java.lang.String[], boolean)

Upvotes: 2
