Reputation: 63
I have a big .txt file (73,000 lines) and I need to store the data from that file in an array.
Here is an example of what the file looks like; every line follows this same pattern:
2016-05-27 11:04:16: QParRFSum=1574.00 QParRFSumMean=6.05 QParSuspSum=83.00
2016-05-27 11:14:07: QParRFSum=1537.00 QParRFSumMean=5.91 QParSuspSum=96.00
2016-05-27 11:14:07: QParRFSum=1537.00 QParRFSumMean=5.91 QParSuspSum=96.00
2016-05-27 11:24:07: QParRFSum=1405.00 QParRFSumMean=5.40 QParSuspSum=0.00
2016-05-27 11:24:07: QParRFSum=1405.00 QParRFSumMean=5.40 QParSuspSum=0.00
2016-05-27 11:34:06: QParRFSum=1533.00 QParRFSumMean=5.90 QParSuspSum=89.00
2016-05-27 11:34:06: QParRFSum=1533.00 QParRFSumMean=5.90 QParSuspSum=89.00
And I would like to put this into an array with keys like this:
Array (
[0] =>
(
[date] => 2016-05-27
[time] => 11:04:16
[QParRFSum] => 1574.00
[QParRFSumMean] => 6.05
[QParSuspSum] => 83.00
)
[1] =>
(
[date] => 2016-05-27
[time] => 11:14:07
[QParRFSum] => 1537.00
[QParRFSumMean] => 5.91
[QParSuspSum] => 96.00
)
[2] =>
(
[date] => 2016-05-27
[time] => 11:14:07
[QParRFSum] => 1537.00
[QParRFSumMean] => 5.91
[QParSuspSum] => 96.00
)
)
And so on...
How would I do this in the best way, with performance in mind?
My thought was to go through it line by line with the file() function and a foreach loop, and then somehow process it into the array structure I want. Something like this:
$txt = file('path/to/file');
foreach ($txt as $line)
{
    $RFAValues[] = $line;
}
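For the "somehow process it" step, one sketch of turning each line into the keyed structure (assuming every line matches the sample format exactly) could use sscanf:

```php
<?php
// Sketch: parse each line into the target structure.
// Assumes every line looks like the sample:
//   2016-05-27 11:04:16: QParRFSum=1574.00 QParRFSumMean=6.05 QParSuspSum=83.00
$RFAValues = [];
foreach (file('path/to/file', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES) as $line) {
    $parsed = sscanf($line, '%s %s QParRFSum=%f QParRFSumMean=%f QParSuspSum=%f');
    if (!is_array($parsed) || in_array(null, $parsed, true)) {
        continue; // skip lines that don't match the expected format
    }
    $RFAValues[] = [
        'date'          => $parsed[0],
        'time'          => rtrim($parsed[1], ':'), // second token still ends with ':'
        'QParRFSum'     => $parsed[2],
        'QParRFSumMean' => $parsed[3],
        'QParSuspSum'   => $parsed[4],
    ];
}
```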
Is this the way to go, or is there a better way that will give me better performance?
Upvotes: 0
Views: 77
Reputation:
I wrote this from memory, but use something like this:
$file = file_get_contents('file.txt');
$list = explode(PHP_EOL, $file);
$_temp = [];
foreach ( $list as $row )
{
    if (trim($row) === '') {
        continue; // skip blank lines, e.g. the trailing newline
    }
    $_explode = explode(' ', $row);
    $_temp[] = [
        'date' => $_explode[0],
        'time' => rtrim($_explode[1], ':'),
        // each remaining token is "Name=value"; keep only the value
        'QParRFSum' => explode('=', $_explode[2])[1],
        'QParRFSumMean' => explode('=', $_explode[3])[1],
        'QParSuspSum' => explode('=', $_explode[4])[1]
    ];
}
var_dump( $_temp );
Upvotes: 1
Reputation: 5062
Personally, if you're handling huge files, I'd use a combination of generators and file streams. You keep a stream open and turn it into values on demand using a generator, so that you never hold the whole file in memory. Maybe something like this:
function readFromFile($fileName) {
    $handle = fopen($fileName, "r");
    if ($handle === false) {
        return; // could not open the file
    }
    while (($line = fgets($handle, 4096)) !== false) {
        // Process line
        yield $line;
    }
    fclose($handle);
}
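Hypothetical usage of readFromFile() above: pull lines from the generator one at a time and split them into the keyed structure from the question, so only one line is ever buffered at once (the file path is just an example):

```php
<?php
// Consume the generator line by line and build the keyed array.
// Assumes each line looks like:
//   2016-05-27 11:04:16: QParRFSum=1574.00 QParRFSumMean=6.05 QParSuspSum=83.00
$RFAValues = [];
foreach (readFromFile('/tmp/inputfile.txt') as $line) {
    $parts = explode(' ', trim($line));
    if (count($parts) < 5) {
        continue; // skip blank or malformed lines
    }
    $RFAValues[] = [
        'date'          => $parts[0],
        'time'          => rtrim($parts[1], ':'),   // strip the trailing ':'
        'QParRFSum'     => explode('=', $parts[2])[1], // value after "Name="
        'QParRFSumMean' => explode('=', $parts[3])[1],
        'QParSuspSum'   => explode('=', $parts[4])[1],
    ];
}
```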
Upvotes: 0