SoLoGHoST

Reputation: 2689

Fastest way possible to read contents of a file

Ok, I'm looking for the fastest possible way to read all of the contents of a file via PHP, given a filepath on the server. These files can be huge, so it's very important that it does a READ ONLY, as fast as possible.

Is reading it line by line faster than reading the entire contents at once? I remember reading somewhere that reading the entire contents can produce errors for huge files. Is this true?

Upvotes: 31

Views: 61657

Answers (9)

Pascal MARTIN

Reputation: 400932

If you want to load the full content of a file into a PHP variable, the easiest (and probably fastest) way is file_get_contents.

But, if you are working with big files, loading the whole file into memory might not be such a good idea: you'll probably end up with a memory_limit error, as PHP will not allow your script to use more memory than the configured memory_limit allows.

So, even if it's not the fastest solution, reading the file line by line (fopen+fgets+fclose), and working with those lines on the fly, without loading the whole file into memory, might be necessary...
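
As a minimal sketch of the two approaches this answer describes (the file path and the per-line processing are placeholders):

// Whole file at once: simple, but the full contents must fit in memory.
$contents = file_get_contents('/path/to/file.txt');

// Line by line: only one line is held in memory at a time.
$handle = fopen('/path/to/file.txt', 'r');
if ($handle !== false) {
    while (($line = fgets($handle)) !== false) {
        // process $line here
    }
    fclose($handle);
}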

Upvotes: 42

Sanjay Khatri

Reputation: 4211

$file_handle = fopen("myfile", "r");
while (!feof($file_handle)) {
    $line = fgets($file_handle);
    echo $line;
}
fclose($file_handle);
  1. Open the file and store a reference to it in $file_handle.
  2. Check whether you are already at the end of the file.
  3. Keep reading the file until you are at the end, printing each line as you read it.
  4. Close the file.

Upvotes: 12

ashraf mohammed

Reputation: 1350

foreach (new SplFileObject($filepath) as $lineNumber => $lineContent) {
    echo $lineNumber . "==>" . $lineContent;
    // process your operations here
}

Upvotes: 2

Keka_Umans

Reputation: 1

You could try cURL (http://php.net/manual/en/book.curl.php).

Although you might want to check first; it has its limits as well.

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://example.com/");
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$data = curl_exec($ch); // whole page as a string
curl_close($ch);

Upvotes: -4

ppostma1

Reputation: 3676

If you're not worried about memory and file size,

$lines = file($path);

$lines is then an array of the file's lines.
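
For instance, a small sketch (the path is a placeholder); the optional flags strip trailing newlines and skip empty lines:

$lines = file('/path/to/file.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
foreach ($lines as $number => $line) {
    echo ($number + 1) . ': ' . $line . "\n";
}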

Upvotes: 1

ACME Squares

Reputation: 33

Use fpassthru or readfile. Both use constant memory regardless of file size.

http://raditha.com/wiki/Readfile_vs_include
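
A rough sketch of both (the path is a placeholder); each streams the file straight to the output buffer instead of building a string in memory:

// readfile(): opens, streams and closes the file for you.
readfile('/path/to/large.file');

// fpassthru(): same idea, starting from an already-open handle.
$handle = fopen('/path/to/large.file', 'rb');
if ($handle !== false) {
    fpassthru($handle);
    fclose($handle);
}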

Upvotes: 2

Alix Axel

Reputation: 154513

file_get_contents() is the most optimized way to read files in PHP; however, since you're reading files into memory, you're always limited by the amount of memory available.

You can issue ini_set('memory_limit', -1) if you have the right permissions, but you'll still be limited by the amount of memory available on your system; this is common to all programming languages.

The only solution is to read the file in chunks. For that, you can use file_get_contents() with the fourth and fifth arguments ($offset and $maxlen, specified in bytes):

string file_get_contents(string $filename[, bool $use_include_path = false[, resource $context[, int $offset = -1[, int $maxlen = -1]]]])

Here is an example where I use this technique to serve large download files:

public function Download($path, $speed = null)
{
    if (is_file($path) === true)
    {
        set_time_limit(0);

        while (ob_get_level() > 0)
        {
            ob_end_clean();
        }

        $size = sprintf('%u', filesize($path));
        $speed = (is_null($speed) === true) ? $size : intval($speed) * 1024; // no $speed given: send everything in one chunk; otherwise treat it as KB/s

        header('Expires: 0');
        header('Pragma: public');
        header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
        header('Content-Type: application/octet-stream');
        header('Content-Length: ' . $size);
        header('Content-Disposition: attachment; filename="' . basename($path) . '"');
        header('Content-Transfer-Encoding: binary');

        for ($i = 0; $i <= $size; $i = $i + $speed)
        {
            // ph() appears to be the answerer's own framework helper, roughly equivalent to flush() and sleep()
            ph()->HTTP->Flush(file_get_contents($path, false, null, $i, $speed));
            ph()->HTTP->Sleep(1);
        }

        exit();
    }

    return false;
}

Another option is to use the less optimized fopen(), feof(), fgets() and fclose() functions, especially if you care about getting whole lines at once. Here is another example I provided in another Stack Overflow question, for importing large SQL queries into the database:

function SplitSQL($file, $delimiter = ';')
{
    set_time_limit(0);

    if (is_file($file) === true)
    {
        $file = fopen($file, 'r');

        if (is_resource($file) === true)
        {
            $query = array();

            while (feof($file) === false)
            {
                $query[] = fgets($file);

                if (preg_match('~' . preg_quote($delimiter, '~') . '\s*$~iS', end($query)) === 1)
                {
                    $query = trim(implode('', $query));

                    if (mysql_query($query) === false)
                    {
                        echo '<h3>ERROR: ' . $query . '</h3>' . "\n";
                    }

                    else
                    {
                        echo '<h3>SUCCESS: ' . $query . '</h3>' . "\n";
                    }

                    while (ob_get_level() > 0)
                    {
                        ob_end_flush();
                    }

                    flush();
                }

                if (is_string($query) === true)
                {
                    $query = array();
                }
            }

            return fclose($file);
        }
    }

    return false;
}

Which technique you use will really depend on what you're trying to do (as you can see with the SQL import function and the download function), but you'll always have to read the data in chunks.
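
As a rough illustration of the chunked approach with plain fopen()/fread() (the chunk size and path are arbitrary):

$handle = fopen('/path/to/huge.file', 'rb');
if ($handle !== false) {
    while (!feof($handle)) {
        $chunk = fread($handle, 8192); // read 8 KB at a time
        // process $chunk here; memory usage stays constant
    }
    fclose($handle);
}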

Upvotes: 17

zaf

Reputation: 23244

Reading the whole file in one go is faster.

But huge files may eat up all your memory and cause problems. In that case, your safest bet is to read line by line.

Upvotes: 0

Sarfraz

Reputation: 382676

You could use file_get_contents (it works for local file paths as well as URLs).

Example:

$homepage = file_get_contents('http://www.example.com/');
echo $homepage;

Upvotes: 5
