anon

What is a practical way to process, parse, and stream a large text file?

I currently have a log file (text format) that is constantly appended to by a running process. I am using PHP to process it into JSON format, and then fetching it with jQuery's getJSON.

I am wondering what a practical way to fetch the data in the log file would be. I've tried jQuery's post function, but the file is too long to fetch that way. getJSON works, but the log file eventually grows so long that PHP can't process it in full, so nothing gets passed to the callback.

I have thought about limiting the number of lines in the log file (it is tee'd on CentOS), and about fetching only a certain number of lines from it (impractical for speed), but how would I do so?

Upvotes: 0

Views: 236

Answers (1)

Emil Vikström

Reputation: 91922

To read only the last part of the file, fseek to a position near the end and start reading from there. For example:

define('FILE', '/var/log/logfile');
define('SIZE', 1024*1024); // read at most the last 1 MiB
if (filesize(FILE) <= SIZE) {
  // The whole file fits within the limit; read it all
  $text = file_get_contents(FILE);
} else {
  $fh = fopen(FILE, 'r');
  // Jump to SIZE bytes before the end of the file
  fseek($fh, -SIZE, SEEK_END);
  // Skip forward to the next newline to avoid a broken first line
  $skip = strlen(fgets($fh));
  $text = fread($fh, SIZE - $skip);
  fclose($fh);
}
// Do your work with $text here...
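
Since the question mentions feeding the data to jQuery's getJSON, here is a minimal sketch of what that work could look like, assuming it runs right after the snippet above; the 'lines' key and the 200-line cap are arbitrary choices for illustration, not anything from the question:

header('Content-Type: application/json');
// Split the tail into lines and keep only the most recent ones
$lines = array_slice(explode("\n", trim($text)), -200);
echo json_encode(array('lines' => $lines));

On the client side, getJSON would then receive an object whose lines property is an array of the most recent log lines, which also addresses the "fetch a certain number of lines" idea from the question without scanning the whole file.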

Upvotes: 1
