Reputation: 2886
I have a large number of files in a directory tree, and I'm using PHP to read each one into a string. For example, a file's path looks like this: filerootdir/dir1/dir2/dir3/dir4/dir5/dir6/file.txt.
I have about a million such txt files. Depending on different parameters, PHP reads the corresponding txt file and displays it as part of the web page. I'm testing the PHP program on Windows 7 Pro right now. When a file's path is short, e.g., filerootdir/dir1/file.txt, it loads quickly. But when the path is long, it is VERY slow. I'm wondering if there is a better solution to this problem.
I'm testing my program under WAMP on Windows, but it will eventually be moved to LAMP. Will the file-loading code run faster on a Linux server? Could this be a problem with the Windows operating system?
The code I'm using looks like the following:
if (file_exists($filePath.".html")) {
    $code = file_get_contents($filePath.".html");
}
Thanks very much!
Upvotes: 0
Views: 196
Reputation: 5919
You might consider storing the data in a database instead - with this number of records, especially if the files are small, a database will probably be more efficient. Before you do, read up on indexes - they can grab the right record out of billions in a tiny fraction of a second. See the sketch below.
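As a minimal sketch of what that could look like with PDO and MySQL (the table name, column names, and connection details here are placeholders, not anything from your setup):

<?php
// Hypothetical table, created once in advance:
//   CREATE TABLE pages (
//       path    VARCHAR(255) NOT NULL PRIMARY KEY,  -- indexed lookup key
//       content MEDIUMTEXT   NOT NULL
//   );

$pdo = new PDO(
    'mysql:host=localhost;dbname=mysite;charset=utf8mb4',
    'dbuser',
    'dbpass',
    [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]
);

// Look up one page by its (former) file path; the PRIMARY KEY index
// turns this into a single B-tree lookup instead of a deep directory walk.
$stmt = $pdo->prepare('SELECT content FROM pages WHERE path = ?');
$stmt->execute([$filePath]);
$code = $stmt->fetchColumn();   // false if no row was found

if ($code !== false) {
    echo $code;
}

You would import the existing txt/html files into the table once with a small one-off script; after that, every page view is a single indexed query rather than a filesystem traversal.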
Upvotes: 1