Reputation: 26360
What would be the fastest way to list the names of files from 1000+ directories and sub-directories?
EDIT: The current code I use is:
import java.io.File;

public class DirectoryReader {

    static int spc_count = -1;

    static void Process(File aFile) {
        spc_count++;
        String spcs = "";
        for (int i = 0; i < spc_count; i++)
            spcs += " ";
        if (aFile.isFile())
            System.out.println(spcs + "[FILE] " + aFile.getName());
        else if (aFile.isDirectory()) {
            System.out.println(spcs + "[DIR] " + aFile.getName());
            File[] listOfFiles = aFile.listFiles();
            if (listOfFiles != null) {
                for (int i = 0; i < listOfFiles.length; i++)
                    Process(listOfFiles[i]);
            } else {
                System.out.println(spcs + " [ACCESS DENIED]");
            }
        }
        spc_count--;
    }

    public static void main(String[] args) {
        String nam = "D:/";
        File aFile = new File(nam);
        Process(aFile);
    }
}
Upvotes: 31
Views: 78687
Reputation: 1136
I have written much simpler code. Try this; it visits every folder, subfolder, and file, including hidden ones.
int Files = 0, Directory = 0, HiddenFiles = 0, HiddenDirectory = 0;

public void listf(String directoryName) {
    File file = new File(directoryName);
    File[] fileList = file.listFiles();
    if (fileList != null) {
        for (int i = 0; i < fileList.length; i++) {
            if (fileList[i].isHidden()) {
                if (fileList[i].isFile()) {
                    System.out.println(fileList[i]);
                    HiddenFiles++;
                } else {
                    listf(String.valueOf(fileList[i]));
                    HiddenDirectory++;
                }
            } else if (fileList[i].isFile()) {
                //System.out.println(fileList[i]);
                Files++;
            } else if (fileList[i].isDirectory()) {
                Directory++;
                listf(String.valueOf(fileList[i]));
            }
        }
    }
}

public void Numbers() {
    System.out.println("Files: " + Files + " HiddenFiles: " + HiddenFiles
            + " Hidden Directories: " + HiddenDirectory + " Directories: " + Directory);
}
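A minimal usage sketch, assuming the fields and methods above live in a class named, say, FileCounter (a hypothetical name):

// Hypothetical wrapper class; the fields and methods are the ones shown above.
FileCounter counter = new FileCounter();
counter.listf("D:/");   // walk the tree and update the counters
counter.Numbers();      // print the totals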
Upvotes: -1
Reputation: 1260
If you're open to using a 3rd party library, check out javaxt-core. It includes a multi-threaded recursive directory search that should be faster than iterating through one directory at a time. There are some examples here:
http://www.javaxt.com/javaxt-core/io/Directory/Recursive_Directory_Search
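I haven't reproduced the javaxt API here, but the general idea of a multi-threaded recursive directory search can be sketched with the JDK's fork/join framework (the class name ParallelDirectoryList and the starting path are made up for illustration):

import java.io.File;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.RecursiveTask;

// Recursively collects file names, forking a parallel task for each sub-directory.
public class ParallelDirectoryList extends RecursiveTask<List<String>> {

    private final File dir;

    public ParallelDirectoryList(File dir) {
        this.dir = dir;
    }

    @Override
    protected List<String> compute() {
        List<String> names = new ArrayList<>();
        List<ParallelDirectoryList> subTasks = new ArrayList<>();
        File[] entries = dir.listFiles();
        if (entries != null) {
            for (File entry : entries) {
                if (entry.isDirectory()) {
                    ParallelDirectoryList task = new ParallelDirectoryList(entry);
                    task.fork();              // search the sub-directory in parallel
                    subTasks.add(task);
                } else {
                    names.add(entry.getName());
                }
            }
        }
        for (ParallelDirectoryList task : subTasks) {
            names.addAll(task.join());        // collect results from forked sub-tasks
        }
        return names;
    }

    public static void main(String[] args) {
        List<String> names = new ForkJoinPool().invoke(new ParallelDirectoryList(new File("D:/")));
        System.out.println(names.size() + " files found");
    }
}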
Upvotes: 4
Reputation: 11547
As this answer shows up near the top of Google, I'm adding a Java 7 NIO solution for listing all files and directories; it takes about 80% less time on my system.
// Imports needed for this snippet:
import java.io.IOException;
import java.nio.file.FileVisitResult;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.SimpleFileVisitor;
import java.nio.file.attribute.BasicFileAttributes;

try {
    Path startPath = Paths.get("c:/");
    Files.walkFileTree(startPath, new SimpleFileVisitor<Path>() {
        @Override
        public FileVisitResult preVisitDirectory(Path dir, BasicFileAttributes attrs) {
            System.out.println("Dir: " + dir.toString());
            return FileVisitResult.CONTINUE;
        }

        @Override
        public FileVisitResult visitFile(Path file, BasicFileAttributes attrs) {
            System.out.println("File: " + file.toString());
            return FileVisitResult.CONTINUE;
        }

        @Override
        public FileVisitResult visitFileFailed(Path file, IOException e) {
            // Skip entries that cannot be read instead of aborting the walk.
            return FileVisitResult.CONTINUE;
        }
    });
} catch (IOException e) {
    e.printStackTrace();
}
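As a side note, on Java 8 and later the same tree can also be listed with a stream; a brief sketch (the path is just an example, and unlike the visitFileFailed handler above, an unreadable directory will surface as an UncheckedIOException here):

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.stream.Stream;

public class WalkExample {
    public static void main(String[] args) throws IOException {
        // Files.walk traverses the tree lazily; try-with-resources closes the underlying directory streams.
        try (Stream<Path> paths = Files.walk(Paths.get("c:/"))) {
            paths.forEach(p -> System.out.println(p));
        }
    }
}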
Upvotes: 36
Reputation: 45596
The only improvement is to get rid of the static spc_count and pass the spcs string as a parameter to Process.
public static void main(String[] args) {
    String nam = "D:/";
    File aFile = new File(nam);
    Process("", aFile);
}
And when making the recursive call, do:
static void Process(String spcs, File aFile) {
    ...
    Process(spcs + " ", listOfFiles[i]);
    ...
}
This way you can call this method from more than 1 thread.
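For reference, a complete version of the refactored method along these lines might look like this (just a sketch that combines the original Process code with the change above):

static void Process(String spcs, File aFile) {
    if (aFile.isFile())
        System.out.println(spcs + "[FILE] " + aFile.getName());
    else if (aFile.isDirectory()) {
        System.out.println(spcs + "[DIR] " + aFile.getName());
        File[] listOfFiles = aFile.listFiles();
        if (listOfFiles != null) {
            for (int i = 0; i < listOfFiles.length; i++)
                Process(spcs + " ", listOfFiles[i]); // indentation grows with depth
        } else {
            System.out.println(spcs + " [ACCESS DENIED]");
        }
    }
}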
Upvotes: 5
Reputation: 88816
Until Java 7 introduces the new java.nio.file classes (like DirectoryStream), I'm afraid what you already have will be the fastest.
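For example, once Java 7 is available, a single directory's entries can be listed lazily with DirectoryStream; a minimal sketch (the path is only an example):

import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class ListWithDirectoryStream {
    public static void main(String[] args) throws IOException {
        Path dir = Paths.get("D:/"); // example path
        // DirectoryStream reads entries lazily instead of building a full array up front
        try (DirectoryStream<Path> stream = Files.newDirectoryStream(dir)) {
            for (Path entry : stream) {
                System.out.println(entry.getFileName());
            }
        }
    }
}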
Upvotes: 4
Reputation: 24375
This looks fine (recursively going through the directory). The bottleneck will be all the file I/O you need to do; optimizing your Java will not show any real improvements.
Upvotes: 9