Reputation: 301
I have a script, which I downloaded from the internet, for counting the number of downloads of a file. Basically, it appends a line of text to a log file every time the file is downloaded.
I want to modify it slightly so that the first line is a number, and every time the file is downloaded it increases that number by one (so it's easy to see exactly how many people have downloaded it).
Here's the code I have right now (this isn't working):
$conten = @file_get_contents(LOG_FILE);
//First line: $conten[0];
$content = fgets($conten);
$fo = @fopen(LOG_FILE, 'r+');
if ($fo) {
    $content++;
    @fputs($fo, ".$content.");
    @fclose($fo);
}
$f = @fopen(LOG_FILE, 'a+');
if ($f) {
    @fputs($f, date("m.d.Y g:ia")." ".$_SERVER['REMOTE_ADDR']." ".$fname."\n");
    @fclose($f);
}
The $f part works fine; it's the parts above it that aren't working like I want.
Thanks!
Upvotes: 0
Views: 90
Reputation: 360702
file_get_contents() sucks up the ENTIRE file into a string. You then try to do an fgets() on this string, which is incorrect - fgets() works on filehandles, not strings.
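A minimal sketch of the distinction: open a handle with fopen() first, then hand that handle to fgets() to read just the first line. The filename "downloads.log" and the sample contents are assumptions for illustration only.

```php
<?php
// Hypothetical counter log: the first line holds the running total.
$logFile = 'downloads.log';
file_put_contents($logFile, "42\n03.01.2013 9:15am 203.0.113.7 somefile.zip\n");

// fgets() wants a file handle, not a string, so open the file first.
$count = 0;
$handle = fopen($logFile, 'r');
if ($handle !== false) {
    $firstLine = fgets($handle);     // reads up to and including the first "\n"
    $count = (int) trim($firstLine); // "42\n" -> 42
    fclose($handle);
}
echo $count, "\n"; // prints 42
```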
If you weren't suppressing errors with @ (NEVER a good idea), you'd most likely have seen PHP warn you about this. NEVER suppress errors, especially while developing. It's like playing sports and saying "who cares if I broke my leg, I'm going to run this marathon now".
I'd STRONGLY suggest you use a database for this kind of thing. A simple
UPDATE downloads SET total = total + 1 WHERE file_id = XXX
is far safer than doing these file operations, especially if the log file grows large. You'd be sucking an entire multi-megabyte file into memory, reading one value, then dumping it all out, then opening and rewriting the file again. If two or more downloads complete at the same time, you're going to trash the log file with conflicting reads/writes.
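To sketch what that looks like in practice, here's a PDO example against an in-memory SQLite database. The table and column names (downloads, total, file_id) are assumptions for illustration; the point is that the database makes the increment a single atomic statement.

```php
<?php
// Hypothetical schema: one row per downloadable file, with a running total.
$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec('CREATE TABLE downloads (file_id INTEGER PRIMARY KEY, total INTEGER NOT NULL DEFAULT 0)');
$db->exec('INSERT INTO downloads (file_id, total) VALUES (1, 41)');

// The atomic increment: the database serializes concurrent updates for you.
$stmt = $db->prepare('UPDATE downloads SET total = total + 1 WHERE file_id = ?');
$stmt->execute([1]);

$total = $db->query('SELECT total FROM downloads WHERE file_id = 1')->fetchColumn();
echo $total, "\n"; // prints 42
```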
If you insist on a file-based operation, then look at using flock() to restrict access by other parallel downloads while your script is doing the updates.
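A rough sketch of a flock()-guarded counter, assuming a dedicated counter file ("counter.txt" is a hypothetical name) rather than mixing the count into the log itself:

```php
<?php
// Hypothetical counter file holding only the running total.
$counterFile = 'counter.txt';
file_put_contents($counterFile, "41");

$fp = fopen($counterFile, 'r+');
if ($fp !== false) {
    if (flock($fp, LOCK_EX)) {            // block until we hold an exclusive lock
        $count = (int) trim(fgets($fp));  // read the current total
        rewind($fp);                      // back to the start before rewriting
        ftruncate($fp, 0);                // clear the old value
        fwrite($fp, (string) ($count + 1));
        fflush($fp);                      // make sure it's on disk before unlocking
        flock($fp, LOCK_UN);
    }
    fclose($fp);
}
echo file_get_contents($counterFile), "\n"; // prints 42
```

Keeping the counter in its own small file also avoids rereading the whole log just to bump one number.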
Upvotes: 2